Apr 20 15:58:33 user nova-compute[71605]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
Apr 20 15:58:35 user nova-compute[71605]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=71605) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 20 15:58:36 user nova-compute[71605]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=71605) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 20 15:58:36 user nova-compute[71605]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=71605) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 20 15:58:36 user nova-compute[71605]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.031s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 20 15:58:36 user nova-compute[71605]: INFO nova.virt.driver [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] Loading compute driver 'libvirt.LibvirtDriver'
Apr 20 15:58:36 user nova-compute[71605]: INFO nova.compute.provider_config [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
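The "Loaded VIF plugin class" DEBUG lines and the "Loaded VIF plugins: linux_bridge, noop, ovs" summary above come from os_vif's plugin discovery step (the initialize call in os_vif/__init__.py referenced in the log). A minimal sketch of that step, assuming a Python 3 environment with the os_vif package installed, looks like this:

    import os_vif

    # Discover and load all installed VIF plugins (here: linux_bridge, noop, ovs);
    # this is the call that emits the DEBUG/INFO plugin-loading lines above.
    os_vif.initialize()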
Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] Acquiring lock "singleton_lock" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] Acquired lock "singleton_lock" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] Releasing lock "singleton_lock" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] Full set of CONF: {{(pid=71605) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ******************************************************************************** {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] Configuration options gathered from: {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] command line args: ['--config-file', '/etc/nova/nova-cpu.conf'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] config files: ['/etc/nova/nova-cpu.conf'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ================================================================================ {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] allow_resize_to_same_host = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] arq_binding_timeout = 300 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] backdoor_port = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] backdoor_socket = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc 
None None] block_device_allocate_retries = 300 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] block_device_allocate_retries_interval = 5 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cert = self.pem {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] compute_driver = libvirt.LibvirtDriver {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] compute_monitors = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] config_dir = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] config_drive_format = iso9660 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] config_file = ['/etc/nova/nova-cpu.conf'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] config_source = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] console_host = user {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] control_exchange = nova {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cpu_allocation_ratio = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] daemon = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] debug = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] default_access_ip_network_name = None {{(pid=71605) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] default_availability_zone = nova {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] default_ephemeral_format = ext4 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] default_schedule_zone = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] disk_allocation_ratio = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] enable_new_services = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] enabled_apis = ['osapi_compute'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] enabled_ssl_apis = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] flat_injected = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] force_config_drive = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] force_raw_images = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] graceful_shutdown_timeout = 5 {{(pid=71605) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] heal_instance_info_cache_interval = 60 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] host = user {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] initial_cpu_allocation_ratio = 4.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] initial_disk_allocation_ratio = 1.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] initial_ram_allocation_ratio = 1.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] instance_build_timeout = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] instance_delete_interval = 300 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] instance_format = [instance: %(uuid)s] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] instance_name_template = instance-%08x {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] instance_usage_audit = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] instance_usage_audit_period = month {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] 
instances_path = /opt/stack/data/nova/instances {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] internal_service_availability_zone = internal {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] key = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] live_migration_retry_count = 30 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] log_config_append = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] log_dir = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] log_file = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] log_options = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] log_rotate_interval = 1 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] log_rotate_interval_type = days {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] log_rotation_type = none {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=71605) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] long_rpc_timeout = 1800 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] max_concurrent_builds = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] max_concurrent_live_migrations = 1 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] max_concurrent_snapshots = 5 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] max_local_block_devices = 3 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] max_logfile_count = 30 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] max_logfile_size_mb = 200 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] maximum_instance_delete_attempts = 5 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] metadata_listen = 0.0.0.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] metadata_listen_port = 8775 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user 
nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] metadata_workers = 3 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] migrate_max_retries = -1 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] mkisofs_cmd = genisoimage {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] my_block_storage_ip = 10.0.0.210 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] my_ip = 10.0.0.210 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] network_allocate_retries = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] osapi_compute_listen = 0.0.0.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] osapi_compute_listen_port = 8774 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] osapi_compute_unique_server_name_scope = {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] osapi_compute_workers = 3 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] password_length = 12 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] periodic_enable = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] periodic_fuzzy_delay = 60 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: 
DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] pointer_model = ps2mouse {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] preallocate_images = none {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] publish_errors = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] pybasedir = /opt/stack/nova {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ram_allocation_ratio = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] rate_limit_burst = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] rate_limit_except_level = CRITICAL {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] rate_limit_interval = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] reboot_timeout = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] reclaim_instance_interval = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] record = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] reimage_timeout_per_gb = 20 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] report_interval = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] rescue_timeout = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] reserved_host_cpus = 
0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] reserved_host_disk_mb = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] reserved_host_memory_mb = 512 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] reserved_huge_pages = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] resize_confirm_window = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] resize_fs_using_block_device = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] resume_guests_state_on_host_boot = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] rpc_response_timeout = 60 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] run_external_periodic_tasks = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] running_deleted_instance_action = reap {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] running_deleted_instance_poll_interval = 1800 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] running_deleted_instance_timeout = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] scheduler_instance_sync_interval = 120 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None 
None] service_down_time = 60 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] servicegroup_driver = db {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] shelved_offload_time = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] shelved_poll_interval = 3600 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] shutdown_timeout = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] source_is_ipv6 = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ssl_only = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] state_path = /opt/stack/data/nova {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] sync_power_state_interval = 600 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] sync_power_state_pool_size = 1000 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] syslog_log_facility = LOG_USER {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] tempdir = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] timeout_nbd = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] transport_url = **** {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] update_resources_interval = 0 {{(pid=71605) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] use_cow_images = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] use_eventlog = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] use_journal = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] use_json = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] use_rootwrap_daemon = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] use_stderr = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] use_syslog = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vcpu_pin_set = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vif_plugging_is_fatal = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vif_plugging_timeout = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] virt_mkfs = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] volume_usage_poll_interval = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] watch_log_file = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] web = /usr/share/spice-html5 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG 
oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_concurrency.disable_process_locking = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_concurrency.lock_path = /opt/stack/data/nova {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_metrics.metrics_process_name = {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.auth_strategy = keystone {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.compute_link_prefix = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.dhcp_domain = novalocal {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.enable_instance_password = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.glance_link_prefix = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service 
[None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.instance_list_cells_batch_strategy = distributed {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.instance_list_per_project_cells = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.list_records_by_skipping_down_cells = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.local_metadata_per_cell = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.max_limit = 1000 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.metadata_cache_expiration = 15 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.neutron_default_tenant_id = default {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.use_forwarded_for = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.use_neutron_default_nets = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.vendordata_dynamic_failure_fatal = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.vendordata_dynamic_ssl_certfile = {{(pid=71605) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.vendordata_dynamic_targets = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.vendordata_jsonfile_path = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api.vendordata_providers = ['StaticJSON'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.backend = dogpile.cache.memcached {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.backend_argument = **** {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.config_prefix = cache.oslo {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.dead_timeout = 60.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.debug_cache_backend = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.enable_retry_client = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.enable_socket_keepalive = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.enabled = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.expiration_time = 600 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.hashclient_retry_attempts = 2 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.hashclient_retry_delay = 1.0 
{{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.memcache_dead_retry = 300 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.memcache_password = {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.memcache_pool_maxsize = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.memcache_pool_unused_timeout = 60 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.memcache_sasl_enabled = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.memcache_servers = ['localhost:11211'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.memcache_socket_timeout = 1.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.memcache_username = {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.proxies = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.retry_attempts = 2 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.retry_delay = 0.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] 
cache.socket_keepalive_count = 1 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.socket_keepalive_idle = 1 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.socket_keepalive_interval = 1 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.tls_allowed_ciphers = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.tls_cafile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.tls_certfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.tls_enabled = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cache.tls_keyfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cinder.auth_section = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cinder.auth_type = password {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cinder.cafile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cinder.catalog_info = volumev3::publicURL {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cinder.certfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cinder.collect_timing = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cinder.cross_az_attach = True 
{{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cinder.debug = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cinder.endpoint_template = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cinder.http_retries = 3 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cinder.insecure = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cinder.keyfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cinder.os_region_name = RegionOne {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cinder.split_loggers = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cinder.timeout = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] compute.cpu_dedicated_set = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] compute.cpu_shared_set = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] compute.image_type_exclude_list = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] compute.live_migration_wait_for_vif_plug = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] compute.max_concurrent_disk_ops = 
0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] compute.max_disk_devices_to_attach = -1 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] compute.resource_provider_association_refresh = 300 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] compute.shutdown_retry_interval = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] conductor.workers = 3 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] console.allowed_origins = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] console.ssl_ciphers = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] console.ssl_minimum_version = default {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] consoleauth.token_ttl = 600 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cyborg.cafile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cyborg.certfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG 
oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cyborg.collect_timing = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cyborg.connect_retries = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cyborg.connect_retry_delay = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cyborg.endpoint_override = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cyborg.insecure = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cyborg.keyfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cyborg.max_version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cyborg.min_version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cyborg.region_name = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cyborg.service_name = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cyborg.service_type = accelerator {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cyborg.split_loggers = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cyborg.status_code_retries = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cyborg.status_code_retry_delay = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None 
req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cyborg.timeout = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] cyborg.version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] database.backend = sqlalchemy {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] database.connection = **** {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] database.connection_debug = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] database.connection_parameters = {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] database.connection_recycle_time = 3600 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] database.connection_trace = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] database.db_inc_retry_interval = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] database.db_max_retries = 20 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] database.db_max_retry_interval = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] database.db_retry_interval = 1 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] database.max_overflow = 50 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service 
[None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] database.max_pool_size = 5 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] database.max_retries = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] database.mysql_enable_ndb = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] database.mysql_sql_mode = TRADITIONAL {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] database.mysql_wsrep_sync_wait = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] database.pool_timeout = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] database.retry_interval = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] database.slave_connection = **** {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] database.sqlite_synchronous = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api_database.backend = sqlalchemy {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api_database.connection = **** {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api_database.connection_debug = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api_database.connection_parameters = {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api_database.connection_recycle_time = 3600 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: 
DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api_database.connection_trace = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api_database.db_inc_retry_interval = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api_database.db_max_retries = 20 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api_database.db_max_retry_interval = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api_database.db_retry_interval = 1 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api_database.max_overflow = 50 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api_database.max_pool_size = 5 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api_database.max_retries = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api_database.mysql_enable_ndb = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api_database.mysql_wsrep_sync_wait = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api_database.pool_timeout = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api_database.retry_interval = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api_database.slave_connection = **** {{(pid=71605) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] api_database.sqlite_synchronous = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] devices.enabled_mdev_types = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ephemeral_storage_encryption.enabled = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ephemeral_storage_encryption.key_size = 512 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.api_servers = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.cafile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.certfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.collect_timing = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.connect_retries = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.connect_retry_delay = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.debug = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.default_trusted_certificate_ids = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] 
glance.enable_certificate_validation = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.enable_rbd_download = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.endpoint_override = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.insecure = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.keyfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.max_version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.min_version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.num_retries = 3 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.rbd_ceph_conf = {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.rbd_connect_timeout = 5 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.rbd_pool = {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.rbd_user = {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.region_name = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.service_name = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.service_type = image {{(pid=71605) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.split_loggers = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.status_code_retries = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.status_code_retry_delay = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.timeout = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.verify_glance_signatures = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] glance.version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] guestfs.debug = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] hyperv.config_drive_cdrom = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] hyperv.config_drive_inject_password = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] hyperv.enable_instance_metrics_collection = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] hyperv.enable_remotefx = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] 
hyperv.instances_path_share = {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] hyperv.iscsi_initiator_list = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] hyperv.limit_cpu_features = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] hyperv.power_state_check_timeframe = 60 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] hyperv.power_state_event_polling_interval = 2 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] hyperv.use_multipath_io = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] hyperv.volume_attach_retry_count = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] hyperv.volume_attach_retry_interval = 5 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] hyperv.vswitch_name = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] mks.enabled = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service 
[None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] image_cache.manager_interval = 2400 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] image_cache.precache_concurrency = 1 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] image_cache.remove_unused_base_images = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] image_cache.subdirectory_name = _base {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.api_max_retries = 60 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.api_retry_interval = 2 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.auth_section = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.auth_type = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.cafile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.certfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.collect_timing = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 
20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.connect_retries = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.connect_retry_delay = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.endpoint_override = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.insecure = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.keyfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.max_version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.min_version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.partition_key = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.peer_list = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.region_name = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.serial_console_state_timeout = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.service_name = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.service_type = baremetal {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.split_loggers = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG 
oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.status_code_retries = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.status_code_retry_delay = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.timeout = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ironic.version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] key_manager.fixed_key = **** {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican.barbican_api_version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican.barbican_endpoint = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican.barbican_endpoint_type = public {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican.barbican_region_name = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican.cafile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican.certfile = None {{(pid=71605) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican.collect_timing = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican.insecure = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican.keyfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican.number_of_retries = 60 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican.retry_delay = 1 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican.send_service_user_token = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican.split_loggers = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican.timeout = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican.verify_ssl = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican.verify_ssl_path = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican_service_user.auth_section = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican_service_user.auth_type = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican_service_user.cafile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican_service_user.certfile = None {{(pid=71605) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican_service_user.collect_timing = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican_service_user.insecure = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican_service_user.keyfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican_service_user.split_loggers = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] barbican_service_user.timeout = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vault.approle_role_id = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vault.approle_secret_id = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vault.cafile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vault.certfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vault.collect_timing = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vault.insecure = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vault.keyfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vault.kv_mountpoint = secret {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vault.kv_version = 2 {{(pid=71605) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vault.namespace = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vault.root_token_id = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vault.split_loggers = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vault.ssl_ca_crt_file = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vault.timeout = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vault.use_ssl = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] keystone.cafile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] keystone.certfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] keystone.collect_timing = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] keystone.connect_retries = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] keystone.connect_retry_delay = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] keystone.endpoint_override = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] keystone.insecure = False {{(pid=71605) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] keystone.keyfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] keystone.max_version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] keystone.min_version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] keystone.region_name = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] keystone.service_name = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] keystone.service_type = identity {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] keystone.split_loggers = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] keystone.status_code_retries = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] keystone.status_code_retry_delay = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] keystone.timeout = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] keystone.version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.connection_uri = {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.cpu_mode = custom {{(pid=71605) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.cpu_model_extra_flags = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: WARNING oslo_config.cfg [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] Deprecated: Option "cpu_model" from group "libvirt" is deprecated. Use option "cpu_models" from group "libvirt". Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.cpu_models = ['Nehalem'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.cpu_power_governor_high = performance {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.cpu_power_governor_low = powersave {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.cpu_power_management = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.device_detach_attempts = 8 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.device_detach_timeout = 20 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.disk_cachemodes = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.disk_prefix = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.enabled_perf_events = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.file_backed_memory = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.gid_maps = [] 
{{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.hw_disk_discard = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.hw_machine_type = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.images_rbd_ceph_conf = {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.images_rbd_glance_store_name = {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.images_rbd_pool = rbd {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.images_type = default {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.images_volume_group = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.inject_key = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.inject_partition = -2 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.inject_password = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.iscsi_iface = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] 
libvirt.iser_use_multipath = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.live_migration_bandwidth = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.live_migration_completion_timeout = 800 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.live_migration_downtime = 500 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.live_migration_downtime_delay = 75 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.live_migration_downtime_steps = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.live_migration_inbound_addr = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.live_migration_permit_auto_converge = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.live_migration_permit_post_copy = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.live_migration_scheme = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.live_migration_timeout_action = abort {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.live_migration_tunnelled = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: WARNING oslo_config.cfg [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Apr 20 15:58:36 user nova-compute[71605]: live_migration_uri is deprecated for removal in favor of two other options that Apr 20 15:58:36 user nova-compute[71605]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Apr 20 15:58:36 user nova-compute[71605]: and 
``live_migration_inbound_addr`` respectively. Apr 20 15:58:36 user nova-compute[71605]: ). Its value may be silently ignored in the future. Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.live_migration_uri = qemu+ssh://stack@%s/system {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.live_migration_with_native_tls = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.max_queues = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.mem_stats_period_seconds = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.nfs_mount_options = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.nfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.num_aoe_discover_tries = 3 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.num_iser_scan_tries = 5 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.num_memory_encrypted_guests = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.num_nvme_discover_tries = 5 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.num_pcie_ports = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.num_volume_scan_tries = 5 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.pmem_namespaces = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user 
nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.quobyte_client_cfg = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.quobyte_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.rbd_connect_timeout = 5 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.rbd_secret_uuid = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.rbd_user = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.realtime_scheduler_priority = 1 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.remote_filesystem_transport = ssh {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.rescue_image_id = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.rescue_kernel_id = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.rescue_ramdisk_id = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.rng_dev_path = /dev/urandom {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.rx_queue_size = None {{(pid=71605) log_opt_values 
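The two "Deprecated:" WARNING lines above (for [libvirt]cpu_model and [libvirt]live_migration_uri) come from oslo.config's standard deprecation handling: the old value is still honoured, but a warning is logged when the option is read at startup. The following is a minimal, illustrative sketch of how such options are declared with oslo.config; these are not Nova's exact option definitions.

    # Illustrative oslo.config declarations (not Nova's exact ones) showing what
    # produces the "Deprecated:" warnings logged above.
    from oslo_config import cfg

    libvirt_opts = [
        # Reading the old name [libvirt]cpu_model still works, but logs
        # 'Option "cpu_model" from group "libvirt" is deprecated. Use "cpu_models"...'.
        cfg.ListOpt('cpu_models',
                    default=[],
                    deprecated_name='cpu_model',
                    help='Ordered list of CPU models the host supports.'),
        # An option flagged deprecated_for_removal logs the longer
        # 'deprecated for removal (...). Its value may be silently ignored...' text.
        cfg.StrOpt('live_migration_uri',
                   deprecated_for_removal=True,
                   deprecated_reason='Superseded by live_migration_scheme and '
                                     'live_migration_inbound_addr.',
                   help='Override for the live migration target URI.'),
    ]

    CONF = cfg.CONF
    CONF.register_opts(libvirt_opts, group='libvirt')
    # After CONF(['--config-file', '/etc/nova/nova-cpu.conf']) the logged values
    # resolve as attributes, e.g. CONF.libvirt.cpu_models == ['Nehalem'].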
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.smbfs_mount_options = {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.smbfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.snapshot_compression = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.snapshot_image_format = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.sparse_logical_volumes = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.swtpm_enabled = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.swtpm_group = tss {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.swtpm_user = tss {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.sysinfo_serial = unique {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.tx_queue_size = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.uid_maps = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.use_virtio_for_bridges = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] 
libvirt.virt_type = kvm {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.volume_clear = zero {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.volume_clear_size = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.volume_use_multipath = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.vzstorage_cache_path = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.vzstorage_mount_group = qemu {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.vzstorage_mount_opts = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.vzstorage_mount_user = stack {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.auth_section = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.auth_type = password {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user 
nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.cafile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.certfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.collect_timing = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.connect_retries = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.connect_retry_delay = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.default_floating_pool = public {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.endpoint_override = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.extension_sync_interval = 600 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.http_retries = 3 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.insecure = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.keyfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.max_version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.metadata_proxy_shared_secret = **** {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.min_version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: 
DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.ovs_bridge = br-int {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.physnets = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.region_name = RegionOne {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.service_metadata_proxy = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.service_name = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.service_type = network {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.split_loggers = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.status_code_retries = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.status_code_retry_delay = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.timeout = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] neutron.version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] notifications.bdms_in_notifications = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] notifications.default_level = INFO {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user 
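The pair [neutron]service_metadata_proxy = True and metadata_proxy_shared_secret = **** logged just above enables metadata requests proxied through Neutron. Roughly, the proxy adds an instance-ID header plus an HMAC-SHA256 signature keyed with the shared secret, and the metadata service recomputes and compares it. A small sketch with placeholder values (not the masked secret from this log):

    import hashlib
    import hmac

    # Placeholder values for illustration only.
    shared_secret = b'metadata-proxy-secret'
    instance_id = 'aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee'

    def sign(secret: bytes, inst_id: str) -> str:
        # What the proxy side computes and sends along with the instance ID.
        return hmac.new(secret, inst_id.encode(), hashlib.sha256).hexdigest()

    def verify(secret: bytes, inst_id: str, provided_sig: str) -> bool:
        # What the metadata service does with the received signature.
        expected = hmac.new(secret, inst_id.encode(), hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, provided_sig)

    assert verify(shared_secret, instance_id, sign(shared_secret, instance_id))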
nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] notifications.notification_format = unversioned {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] notifications.notify_on_state_change = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] pci.alias = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] pci.device_spec = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] pci.report_in_placement = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.auth_section = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.auth_type = password {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.auth_url = http://10.0.0.210/identity {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.cafile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.certfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.collect_timing = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.connect_retries = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.connect_retry_delay = None {{(pid=71605) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.default_domain_id = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.default_domain_name = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.domain_id = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.domain_name = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.endpoint_override = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.insecure = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.keyfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.max_version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.min_version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.password = **** {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.project_domain_id = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.project_domain_name = Default {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.project_id = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.project_name = service {{(pid=71605) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.region_name = RegionOne {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.service_name = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.service_type = placement {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.split_loggers = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.status_code_retries = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.status_code_retry_delay = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.system_scope = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.timeout = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.trust_id = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.user_domain_id = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.user_domain_name = Default {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.user_id = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.username = placement {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.valid_interfaces = ['internal', 'public'] 
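The [placement] options logged above (auth_type = password, auth_url, username, project_name, the domain names, region_name, service_type and valid_interfaces) are the standard keystoneauth1 auth/session/adapter options. A standalone sketch of turning a group shaped like this into an authenticated client, assuming keystoneauth1's conf-loading helpers; this is illustrative, not Nova's actual placement client code.

    # Illustrative use of keystoneauth1 conf-options helpers against a group
    # shaped like the [placement] section above (not Nova's implementation).
    from keystoneauth1 import loading as ks_loading
    from oslo_config import cfg

    CONF = cfg.CONF
    ks_loading.register_auth_conf_options(CONF, 'placement')
    ks_loading.register_session_conf_options(CONF, 'placement')
    ks_loading.register_adapter_conf_options(CONF, 'placement')
    CONF(['--config-file', '/etc/nova/nova-cpu.conf'])

    auth = ks_loading.load_auth_from_conf_options(CONF, 'placement')
    sess = ks_loading.load_session_from_conf_options(CONF, 'placement', auth=auth)
    # service_type, region_name and valid_interfaces select the catalog endpoint.
    adapter = ks_loading.load_adapter_from_conf_options(CONF, 'placement',
                                                        session=sess)
    resp = adapter.get('/resource_providers')  # hypothetical request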
{{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] placement.version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] quota.cores = 20 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] quota.count_usage_from_placement = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] quota.injected_file_content_bytes = 10240 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] quota.injected_file_path_length = 255 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] quota.injected_files = 5 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] quota.instances = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] quota.key_pairs = 100 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] quota.metadata_items = 128 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] quota.ram = 51200 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] quota.recheck_quota = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] quota.server_group_members = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] quota.server_groups = 10 {{(pid=71605) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] rdp.enabled = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] scheduler.image_metadata_prefilter = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] scheduler.max_attempts = 3 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] scheduler.max_placement_results = 1000 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] scheduler.query_placement_for_availability_zone = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] scheduler.query_placement_for_image_type_support = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] scheduler.workers = 3 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 
15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.enabled_filters = ['AvailabilityZoneFilter', 'ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.host_subset_size = 1 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.image_properties_default_architecture = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.isolated_hosts = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: 
DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.isolated_images = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.max_instances_per_host = 50 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.pci_in_placement = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.track_instance_changes = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] metrics.required = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service 
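The filter_scheduler.*_weight_multiplier values above parametrize host weighing: each enabled weigher scores every candidate host, scores are normalized across hosts, scaled by that weigher's multiplier (negative multipliers such as io_ops_weight_multiplier = -1.0 act as penalties) and summed to rank hosts. A simplified sketch of that combination step, not the scheduler's actual weigher classes:

    # Simplified host-weighing sketch (illustrative, not nova.scheduler.weights):
    # normalize each weigher's raw scores to [0, 1], scale by its multiplier, sum.
    def normalize(raw):
        lo, hi = min(raw), max(raw)
        return [0.0] * len(raw) if hi == lo else [(v - lo) / (hi - lo) for v in raw]

    def weigh_hosts(hosts, weighers):
        # weighers: list of (multiplier, score_fn) pairs, e.g.
        #   ( 1.0, lambda h: h['free_ram_mb'])   # ram_weight_multiplier = 1.0
        #   (-1.0, lambda h: h['num_io_ops'])    # io_ops_weight_multiplier = -1.0
        totals = [0.0] * len(hosts)
        for multiplier, score_fn in weighers:
            for i, n in enumerate(normalize([score_fn(h) for h in hosts])):
                totals[i] += multiplier * n
        return sorted(zip(totals, hosts), key=lambda pair: pair[0], reverse=True)

    ranked = weigh_hosts(
        [{'name': 'cmp1', 'free_ram_mb': 4096, 'num_io_ops': 3},
         {'name': 'cmp2', 'free_ram_mb': 8192, 'num_io_ops': 0}],
        [(1.0, lambda h: h['free_ram_mb']), (-1.0, lambda h: h['num_io_ops'])])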
[None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] metrics.weight_multiplier = 1.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] metrics.weight_of_unavailable = -10000.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] metrics.weight_setting = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] serial_console.enabled = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] serial_console.port_range = 10000:20000 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] serial_console.serialproxy_port = 6083 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] service_user.auth_section = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] service_user.auth_type = password {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] service_user.cafile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] service_user.certfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] service_user.collect_timing = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 
15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] service_user.insecure = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] service_user.keyfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] service_user.send_service_user_token = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] service_user.split_loggers = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] service_user.timeout = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] spice.agent_enabled = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] spice.enabled = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] spice.html5proxy_base_url = http://10.0.0.210:6081/spice_auto.html {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] spice.html5proxy_host = 0.0.0.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] spice.html5proxy_port = 6082 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] spice.image_compression = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] spice.jpeg_compression = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] spice.playback_compression = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] spice.server_listen = 127.0.0.1 {{(pid=71605) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] spice.streaming_mode = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] spice.zlib_compression = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] upgrade_levels.baseapi = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] upgrade_levels.cert = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] upgrade_levels.compute = auto {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] upgrade_levels.conductor = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] upgrade_levels.scheduler = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vendordata_dynamic_auth.auth_section = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vendordata_dynamic_auth.auth_type = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vendordata_dynamic_auth.cafile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vendordata_dynamic_auth.certfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vendordata_dynamic_auth.collect_timing = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] 
vendordata_dynamic_auth.insecure = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vendordata_dynamic_auth.keyfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vendordata_dynamic_auth.split_loggers = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vendordata_dynamic_auth.timeout = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.api_retry_count = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.ca_file = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.cache_prefix = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.cluster_name = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.connection_pool_size = 10 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.console_delay_seconds = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.datastore_regex = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.host_ip = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.host_password = **** {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.host_port = 443 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] 
vmware.host_username = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.insecure = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.integration_bridge = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.maximum_objects = 100 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.pbm_default_policy = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.pbm_enabled = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.pbm_wsdl_location = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.serial_port_proxy_uri = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.serial_port_service_uri = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.task_poll_interval = 0.5 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.use_linked_clone = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.vnc_keymap = en-us {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vmware.vnc_port = 5900 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] 
vmware.vnc_port_total = 10000 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vnc.auth_schemes = ['none'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vnc.enabled = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vnc.novncproxy_base_url = http://10.0.0.210:6080/vnc_lite.html {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vnc.novncproxy_port = 6080 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vnc.server_listen = 0.0.0.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vnc.server_proxyclient_address = 10.0.0.210 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vnc.vencrypt_ca_certs = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vnc.vencrypt_client_cert = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vnc.vencrypt_client_key = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] workarounds.disable_fallback_pcpu_query = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] workarounds.disable_group_policy_check_upcall = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG 
oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] workarounds.disable_rootwrap = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] workarounds.enable_numa_live_migration = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] workarounds.handle_virt_lifecycle_events = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] workarounds.libvirt_disable_apic = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] workarounds.never_download_image_if_on_rbd = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None 
req-215d0104-aa65-4400-9993-0fe034ac69fc None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] wsgi.client_socket_timeout = 900 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] wsgi.default_pool_size = 1000 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] wsgi.keep_alive = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] wsgi.max_header_line = 16384 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] wsgi.secure_proxy_ssl_header = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] wsgi.ssl_ca_file = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] wsgi.ssl_cert_file = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] wsgi.ssl_key_file = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] wsgi.tcp_keepidle = 600 {{(pid=71605) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] zvm.ca_file = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] zvm.cloud_connector_url = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] zvm.image_tmp_path = /opt/stack/data/nova/images {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] zvm.reachable_timeout = 300 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_policy.enforce_new_defaults = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_policy.enforce_scope = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_policy.policy_default_rule = default {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_policy.policy_file = policy.yaml {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] profiler.connection_string = messaging:// {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] profiler.enabled = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] profiler.es_doc_type = notification {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] profiler.es_scroll_size = 10000 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] profiler.es_scroll_time = 2m {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] profiler.filter_error_trace = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] profiler.hmac_keys = SECRET_KEY {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] profiler.sentinel_service_name = mymaster {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] profiler.socket_timeout = 0.1 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] 
profiler.trace_sqlalchemy = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] remote_debug.host = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] remote_debug.port = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=71605) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.rabbit_quroum_max_memory_bytes = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.rabbit_quroum_max_memory_length = 0 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 
{{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.ssl = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_rabbit.ssl_version = {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_notifications.retry = -1 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_messaging_notifications.transport_url = **** {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.auth_section = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.auth_type = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user 
nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.cafile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.certfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.collect_timing = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.connect_retries = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.connect_retry_delay = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.endpoint_id = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.endpoint_override = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.insecure = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.keyfile = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.max_version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.min_version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.region_name = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.service_name = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.service_type = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user 
nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.split_loggers = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.status_code_retries = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.status_code_retry_delay = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.timeout = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.valid_interfaces = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_limit.version = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_reports.file_event_handler = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_reports.file_event_handler_interval = 1 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] oslo_reports.log_dir = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vif_plug_linux_bridge_privileged.group = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] 
vif_plug_linux_bridge_privileged.thread_pool_size = 12 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vif_plug_linux_bridge_privileged.user = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vif_plug_ovs_privileged.group = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vif_plug_ovs_privileged.helper_command = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vif_plug_ovs_privileged.thread_pool_size = 12 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] vif_plug_ovs_privileged.user = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] os_vif_linux_bridge.flat_interface = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=71605) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] os_vif_linux_bridge.vlan_interface = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] os_vif_ovs.isolate_vif = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] os_vif_ovs.ovsdb_interface = native {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] os_vif_ovs.per_port_bridge = False {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] os_brick.lock_path = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] privsep_osbrick.capabilities = [21] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] privsep_osbrick.group = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] privsep_osbrick.helper_command = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None 
req-215d0104-aa65-4400-9993-0fe034ac69fc None None] privsep_osbrick.thread_pool_size = 12 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] privsep_osbrick.user = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] nova_sys_admin.group = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] nova_sys_admin.helper_command = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] nova_sys_admin.thread_pool_size = 12 {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] nova_sys_admin.user = None {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG oslo_service.service [None req-215d0104-aa65-4400-9993-0fe034ac69fc None None] ******************************************************************************** {{(pid=71605) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} Apr 20 15:58:36 user nova-compute[71605]: INFO nova.service [-] Starting compute node (version 0.0.0) Apr 20 15:58:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Starting native event thread {{(pid=71605) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:492}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Starting green dispatch thread {{(pid=71605) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:498}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Starting connection event dispatch thread {{(pid=71605) initialize /opt/stack/nova/nova/virt/libvirt/host.py:620}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Connecting to libvirt: qemu:///system {{(pid=71605) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:503}} Apr 20 15:58:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Registering for lifecycle events {{(pid=71605) 
_get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:509}}
Apr 20 15:58:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Registering for connection events: {{(pid=71605) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:530}}
Apr 20 15:58:36 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Connection event '1' reason 'None'
Apr 20 15:58:36 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Cannot update service status on host "user" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host user could not be found.
Apr 20 15:58:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.volume.mount [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Initialising _HostMountState generation 0 {{(pid=71605) host_up /opt/stack/nova/nova/virt/libvirt/volume/mount.py:130}}
Apr 20 15:58:44 user nova-compute[71605]: INFO nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Libvirt host capabilities
Apr 20 15:58:44 user nova-compute[71605]: [libvirt host capabilities XML elided; recoverable fields: host UUID e20c3142-5af9-7467-ecd8-70b2e4a210d6; x86_64 CPU, model IvyBridge-IBRS, vendor Intel; migration transports tcp and rdma; host memory/page figures 8152920/2038230/0 and 8255068/2063767/0; security models apparmor and dac (doi 0, baselabel +64055:+108); hvm guest entries with machine-type lists for /usr/bin/qemu-system-{alpha, arm, aarch64, cris, i386, m68k, microblaze, microblazeel, mips, mipsel, mips64, mips64el, ppc, ppc64, ppc64le, riscv32, riscv64, s390x, sh4, sh4eb, sparc, sparc64, x86_64, xtensa, xtensaeb}, including the x86 pc-i440fx-*/pc-q35-* and arm/aarch64 virt-* variants]
Apr 20 15:58:44 user
nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for alpha via machine types: {None} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch alpha / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-alpha' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for armv6l via machine types: {'virt', None} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for armv7l via machine types: {'virt'} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch armv7l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for aarch64 via machine types: {'virt'} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch aarch64 / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-aarch64' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for cris via machine types: {None} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG 
nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch cris / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-cris' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for i686 via machine types: {'ubuntu-q35', 'q35', 'ubuntu', 'pc'} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu-q35: [domainCapabilities XML — markup stripped in this capture: emulator /usr/bin/qemu-system-i386, virt type kvm, machine pc-q35-jammy, arch i686; loader images /usr/share/OVMF/OVMF_CODE.fd, /usr/share/OVMF/OVMF_CODE.secboot.fd, /usr/share/OVMF/OVMF_CODE.ms.fd, /usr/share/AAVMF/AAVMF_CODE.fd, /usr/share/AAVMF/AAVMF32_CODE.fd; loader types rom/pflash; host CPU model IvyBridge-IBRS (Intel); named CPU models from qemu64, qemu32, kvm64, kvm32 through the Westmere/SandyBridge/IvyBridge/Haswell/Broadwell/Skylake/Cascadelake/Cooperlake/Icelake/Snowridge and Opteron/EPYC/Dhyana families down to 486; memory backing file/anonymous/memfd; disk devices disk/cdrom/floppy/lun on buses fdc/scsi/virtio/usb/sata; virtio/virtio-transitional/virtio-non-transitional device models; graphics sdl/vnc/spice/egl-headless; hostdev subsystem usb/pci/scsi with startup policies default/mandatory/requisite/optional; rng backends random/egd/builtin; filesystem drivers path/handle/virtiofs; TPM models tpm-tis/tpm-crb with passthrough/emulator backends] {{(pid=71605) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35: [domainCapabilities XML — markup stripped; same value set as above except machine pc-q35-6.2] {{(pid=71605) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu: [domainCapabilities XML — markup stripped; same value set as above except machine pc-i440fx-jammy and an additional ide disk bus] {{(pid=71605) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc: [domainCapabilities XML — markup stripped; same value set as above except machine pc-i440fx-6.2 and an additional ide disk bus] {{(pid=71605) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for m68k via machine types: {'virt', None} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for microblaze via machine types: {None} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch microblaze / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-microblaze' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for microblazeel via machine types: {None} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch microblazeel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-microblazeel' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for mips via machine types: {None} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from
libvirt when retrieving domain capabilities for arch mips / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for mipsel via machine types: {None} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch mipsel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mipsel' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for mips64 via machine types: {None} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch mips64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips64' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for mips64el via machine types: {None} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch mips64el / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips64el' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for ppc via machine types: {None} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch ppc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for ppc64 via machine types: {'pseries', None, 'powernv'} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving 
domain capabilities for arch ppc64 / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for ppc64le via machine types: {'pseries', 'powernv'} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for riscv32 via machine types: {None} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch riscv32 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-riscv32' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for riscv64 via machine types: {None} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch riscv64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported 
by '/usr/bin/qemu-system-riscv64' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for s390x via machine types: {'s390-ccw-virtio'} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch s390x / virt_type kvm / machine_type s390-ccw-virtio: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-s390x' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for sh4 via machine types: {None} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch sh4 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sh4' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for sh4eb via machine types: {None} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch sh4eb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sh4eb' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for sparc via machine types: {None} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch sparc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sparc' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for sparc64 via machine types: {None} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch sparc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by 
'/usr/bin/qemu-system-sparc64' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for x86_64 via machine types: {'ubuntu-q35', 'q35', 'ubuntu', 'pc'} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu-q35: [domainCapabilities XML dump; markup lost in this capture, recoverable values: emulator /usr/bin/qemu-system-x86_64, domain kvm, machine pc-q35-jammy, arch x86_64, efi firmware loaders /usr/share/OVMF/OVMF_CODE_4M.ms.fd, /usr/share/OVMF/OVMF_CODE_4M.secboot.fd and /usr/share/OVMF/OVMF_CODE_4M.fd (loader types rom/pflash, plus yes/no and on/off feature toggles whose names were lost), host CPU model IvyBridge-IBRS (Intel), custom CPU models from 486/qemu32/qemu64 through Broadwell, Haswell, Skylake, Icelake, Cascadelake, Cooperlake, Snowridge, EPYC, EPYC-Rome and EPYC-Milan (full list elided), memory backing file/anonymous/memfd, disk devices disk/cdrom/floppy/lun on buses fdc/scsi/virtio/usb/sata, device models virtio/virtio-transitional/virtio-non-transitional, graphics sdl/vnc/spice/egl-headless, hostdev subsystem usb/pci/scsi with default/mandatory/requisite/optional, rng backends random/egd/builtin, filesystem drivers path/handle/virtiofs, TPM models tpm-tis/tpm-crb with passthrough/emulator backends] {{(pid=71605) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35: [domainCapabilities XML dump; same recoverable values as for ubuntu-q35 except machine pc-q35-6.2] {{(pid=71605) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu: [domainCapabilities XML dump; same recoverable values as for ubuntu-q35 except machine pc-i440fx-jammy, a single efi firmware loader /usr/share/OVMF/OVMF_CODE_4M.fd, and disk buses that additionally include ide] {{(pid=71605) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: [domainCapabilities XML dump; same recoverable values as for ubuntu-q35 except machine pc-i440fx-6.2, a single efi firmware loader /usr/share/OVMF/OVMF_CODE_4M.fd, and disk buses that additionally include ide] {{(pid=71605) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for xtensa via machine types: {None} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch xtensa / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-xtensa' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Getting domain capabilities for xtensaeb via machine types: {None} {{(pid=71605) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Error from libvirt when retrieving domain capabilities for arch xtensaeb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-xtensaeb' on this host {{(pid=71605) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Checking secure boot support for host arch (x86_64) {{(pid=71605) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1750}}
Apr 20 15:58:44 user nova-compute[71605]: INFO nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Secure Boot support detected
Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] cpu compare xml: [CPU XML; markup lost in this capture, recoverable value: model Nehalem] {{(pid=71605) _compare_cpu /opt/stack/nova/nova/virt/libvirt/driver.py:9996}}
Apr 20 15:58:44 user nova-compute[71605]: INFO nova.virt.node [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Generated node identity 00e9f769-1a1c-4f1e-80e4-b19657803102
Apr 20 15:58:44 user nova-compute[71605]: INFO nova.virt.node [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Wrote node identity 00e9f769-1a1c-4f1e-80e4-b19657803102 to /opt/stack/data/nova/compute_id
Apr 20 15:58:44 user nova-compute[71605]: WARNING nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Compute nodes ['00e9f769-1a1c-4f1e-80e4-b19657803102'] for host user were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Apr 20 15:58:44 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host Apr 20 15:58:44 user nova-compute[71605]: WARNING nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] No compute node record found for host user. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host user could not be found. Apr 20 15:58:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 15:58:44 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 15:58:44 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 15:58:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Hypervisor/Node resource view: name=user free_ram=10787MB free_disk=26.833999633789062GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 15:58:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 15:58:44 user nova-compute[71605]: WARNING nova.compute.resource_tracker [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] No compute node record for user:00e9f769-1a1c-4f1e-80e4-b19657803102: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 00e9f769-1a1c-4f1e-80e4-b19657803102 could not be found. Apr 20 15:58:44 user nova-compute[71605]: INFO nova.compute.resource_tracker [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Compute node record created for user:user with uuid: 00e9f769-1a1c-4f1e-80e4-b19657803102 Apr 20 15:58:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 15:58:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 15:58:45 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [req-5ef7a3a2-e482-4cd9-83eb-9c13d8c16172] Created resource provider record via placement API for resource provider with UUID 00e9f769-1a1c-4f1e-80e4-b19657803102 and name user. 
Apr 20 15:58:45 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] /sys/module/kvm_amd/parameters/sev does not exist {{(pid=71605) _kernel_supports_amd_sev /opt/stack/nova/nova/virt/libvirt/host.py:1766}} Apr 20 15:58:45 user nova-compute[71605]: INFO nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] kernel doesn't support AMD SEV Apr 20 15:58:45 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Updating inventory in ProviderTree for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 with inventory: {'MEMORY_MB': {'total': 16023, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 12, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 40, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 20 15:58:45 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 15:58:45 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Libvirt baseline CPU Apr 20 15:58:45 user nova-compute[71605]: x86_64 Apr 20 15:58:45 user nova-compute[71605]: Nehalem Apr 20 15:58:45 user nova-compute[71605]: Intel Apr 20 15:58:45 user nova-compute[71605]: Apr 20 15:58:45 user nova-compute[71605]: Apr 20 15:58:45 user nova-compute[71605]: {{(pid=71605) _get_guest_baseline_cpu_features /opt/stack/nova/nova/virt/libvirt/driver.py:12486}} Apr 20 15:58:45 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Updated inventory for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 16023, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 12, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 40, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} Apr 20 15:58:45 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Updating resource provider 00e9f769-1a1c-4f1e-80e4-b19657803102 generation from 0 to 1 during operation: update_inventory {{(pid=71605) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Apr 20 15:58:45 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Updating inventory in ProviderTree for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 with inventory: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 20 15:58:45 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Updating resource provider 00e9f769-1a1c-4f1e-80e4-b19657803102 generation from 1 to 2 during operation: update_traits {{(pid=71605) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Apr 20 15:58:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 15:58:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.618s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 15:58:45 user nova-compute[71605]: DEBUG nova.service [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Creating RPC server for service compute {{(pid=71605) start /opt/stack/nova/nova/service.py:182}} Apr 20 15:58:45 user nova-compute[71605]: DEBUG nova.service [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Join ServiceGroup membership for this service compute {{(pid=71605) start /opt/stack/nova/nova/service.py:199}} Apr 20 15:58:45 user nova-compute[71605]: DEBUG nova.servicegroup.drivers.db [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] DB_Driver: join new ServiceGroup member user to the compute group, service = {{(pid=71605) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} Apr 20 15:58:57 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._sync_power_states {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 15:58:57 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 15:59:36 user nova-compute[71605]: 
DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Didn't find any instances for network info cache update. {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 15:59:36 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 15:59:36 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
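The clean_compute_node_cache and _update_available_resource entries show the same "compute_resources" lock being taken and released around each audit. A rough sketch of that pattern using oslo.concurrency directly (Nova wraps this in its own helpers):

    from oslo_concurrency import lockutils

    # In-process lock named "compute_resources"; acquiring and releasing it
    # is what produces the "Acquiring/acquired/released" DEBUG lines above.
    @lockutils.synchronized('compute_resources')
    def clean_compute_node_cache():
        pass  # placeholder for the real cache-pruning work

    clean_compute_node_cache()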
Apr 20 15:59:36 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=10187MB free_disk=26.74783706665039GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 15:59:36 user nova-compute[71605]: DEBUG 
oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.134s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:00:36 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:00:36 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:00:36 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:00:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:00:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:00:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:00:36 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:00:37 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:00:37 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
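For reference, the schedulable capacity Placement derives from inventory like the one logged above is (total - reserved) * allocation_ratio; a quick check with the values from these entries:

    # Values copied from the "Inventory has not changed ..." entries above.
    inventory = {
        'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
        'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
        'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)
    # MEMORY_MB 15511.0, VCPU 48.0, DISK_GB 40.0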
Apr 20 16:00:37 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=10164MB free_disk=26.792823791503906GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:00:37 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:00:37 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:00:37 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:00:37 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:00:37 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:00:37 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:00:37 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:00:37 user nova-compute[71605]: DEBUG 
oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.143s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:00:37 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:00:37 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:00:37 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:00:37 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:00:38 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:00:38 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:00:38 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:00:38 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:00:38 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Didn't find any instances for network info cache update. 
{{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 20 16:00:38 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:00:38 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:01:37 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:01:38 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:01:38 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:01:38 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:01:38 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:01:38 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:01:38 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:01:38 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:01:38 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:01:38 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:01:38 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:01:38 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:01:38 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:01:38 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:01:38 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
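The "Running periodic task ComputeManager._..." entries come from oslo.service's periodic task machinery; a minimal sketch of how such tasks are declared and driven (the interval here is illustrative, not Nova's actual default):

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(CONF)

        # Decorated methods are collected and invoked by run_periodic_tasks().
        @periodic_task.periodic_task(spacing=60)
        def _heal_instance_info_cache(self, context):
            pass  # placeholder

    mgr = Manager()
    mgr.run_periodic_tasks(context=None)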
Apr 20 16:01:38 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=10177MB free_disk=26.569931030273438GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:01:38 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:01:38 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:01:38 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:01:38 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:01:38 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:01:38 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:01:38 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:01:38 user nova-compute[71605]: DEBUG 
oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.140s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:01:39 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:01:39 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:01:39 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:01:39 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Didn't find any instances for network info cache update. {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 20 16:02:38 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:02:39 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:02:39 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:02:39 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:02:39 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:02:39 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:02:39 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:02:39 user 
nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:39 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:02:39 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:02:39 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
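Note on the Acquiring/acquired/released trio around the "compute_resources" lock in the entries above: those DEBUG messages are produced by oslo.concurrency's lock wrapper, which the resource tracker uses to serialize clean_compute_node_cache and the periodic resource audit on a single named in-process lock. A minimal sketch of the same pattern follows; the function name and lock usage here are illustrative only, not Nova's actual code.

    import logging

    from oslo_concurrency import lockutils

    # With DEBUG logging enabled, the lockutils wrapper emits the same
    # 'Acquiring lock ... by ...' / '... acquired ... :: waited' /
    # '... "released" ... :: held' messages seen in the log above.
    logging.basicConfig(level=logging.DEBUG)

    @lockutils.synchronized('compute_resources')
    def audit_resources():
        # placeholder for the work done while holding the lock
        pass

    audit_resources()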
Apr 20 16:02:39 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=9449MB free_disk=26.59194564819336GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:02:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:39 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:02:39 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:02:39 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:02:39 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:02:39 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:02:39 user nova-compute[71605]: DEBUG 
oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.113s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:40 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:02:40 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:02:40 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:02:41 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:02:41 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:02:41 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:02:41 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Didn't find any instances for network info cache update. {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 20 16:02:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "a5e68386-3b32-458b-9808-797d041c2235" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "a5e68386-3b32-458b-9808-797d041c2235" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:52 user nova-compute[71605]: DEBUG nova.compute.manager [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Starting instance... 
{{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:02:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:52 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:02:52 user nova-compute[71605]: INFO nova.compute.claims [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Claim successful on node user Apr 20 16:02:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Acquiring lock "d4ea4d29-b178-4da2-b971-76f97031b244" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "d4ea4d29-b178-4da2-b971-76f97031b244" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:52 user nova-compute[71605]: DEBUG nova.compute.manager [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Starting instance... 
{{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:02:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Acquiring lock "6d55e5bd-9b03-40a9-bca9-88545039597c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Lock "6d55e5bd-9b03-40a9-bca9-88545039597c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:52 user nova-compute[71605]: DEBUG nova.compute.manager [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Starting instance... {{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:02:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:52 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:02:52 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:02:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 
tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.415s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:52 user nova-compute[71605]: DEBUG nova.compute.manager [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Start building networks asynchronously for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:02:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.158s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:52 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:02:52 user nova-compute[71605]: INFO nova.compute.claims [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Claim successful on node user Apr 20 16:02:52 user nova-compute[71605]: DEBUG nova.compute.manager [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Allocating IP information in the background. {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:02:52 user nova-compute[71605]: DEBUG nova.network.neutron [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:02:52 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 16:02:52 user nova-compute[71605]: DEBUG nova.compute.manager [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Start building block device mappings for instance. 
{{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:02:52 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.407s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG nova.compute.manager [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Start building networks asynchronously for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.511s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:02:53 user nova-compute[71605]: INFO nova.compute.claims [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Claim successful on node user Apr 20 16:02:53 user nova-compute[71605]: DEBUG nova.compute.manager [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Allocating IP information in the background. 
{{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG nova.network.neutron [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG nova.compute.manager [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Start spawning the instance on the hypervisor. {{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:02:53 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Creating image(s) Apr 20 16:02:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "/opt/stack/data/nova/instances/a5e68386-3b32-458b-9808-797d041c2235/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "/opt/stack/data/nova/instances/a5e68386-3b32-458b-9808-797d041c2235/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "/opt/stack/data/nova/instances/a5e68386-3b32-458b-9808-797d041c2235/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:53 user nova-compute[71605]: 
DEBUG oslo_concurrency.lockutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:53 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 16:02:53 user nova-compute[71605]: DEBUG nova.compute.manager [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG nova.compute.manager [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Start spawning the instance on the hypervisor. 
{{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:02:53 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Creating image(s) Apr 20 16:02:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Acquiring lock "/opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "/opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "/opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.014s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Acquiring lock "91f4b3d1-0fea-4378-94e3-c2bbfd8cad81" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Lock "91f4b3d1-0fea-4378-94e3-c2bbfd8cad81" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.373s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG nova.compute.manager [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Start building networks asynchronously for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG nova.compute.manager [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Starting instance... {{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890.part --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG nova.policy [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '690c49feae904687826fb959ba5ba283', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71cf2664111f45788d24092e8ceede9c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG nova.policy [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9be25e958c6047068ab5ce63106b0754', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd8444d3c8f554a56967917670b19dc37', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:02:53 user nova-compute[71605]: 
DEBUG nova.compute.manager [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Allocating IP information in the background. {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG nova.network.neutron [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:02:53 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 16:02:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:02:53 user nova-compute[71605]: INFO nova.compute.claims [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Claim successful on node user Apr 20 16:02:53 user nova-compute[71605]: DEBUG nova.compute.manager [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Start building block device mappings for instance. 
{{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890.part --force-share --output=json" returned: 0 in 0.154s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG nova.virt.images [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] 4ac69ea5-e5d7-40c8-864e-0a164d78a727 was qcow2, converting to raw {{(pid=71605) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG nova.privsep.utils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71605) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 20 16:02:53 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890.part /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890.converted {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Start spawning the instance on the hypervisor. 
{{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:02:54 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Creating image(s) Apr 20 16:02:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Acquiring lock "/opt/stack/data/nova/instances/6d55e5bd-9b03-40a9-bca9-88545039597c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Lock "/opt/stack/data/nova/instances/6d55e5bd-9b03-40a9-bca9-88545039597c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Lock "/opt/stack/data/nova/instances/6d55e5bd-9b03-40a9-bca9-88545039597c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890.part /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890.converted" returned: 0 in 0.136s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Running cmd (subprocess): /usr/bin/python3.10 
-m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890.converted --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890.converted --force-share --output=json" returned: 0 in 0.142s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:54 user nova-compute[71605]: INFO oslo.privsep.daemon [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpcjg8ixhw/privsep.sock'] Apr 20 16:02:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.757s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.174s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.001s {{(pid=71605) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:54 user sudo[80339]: stack : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context nova.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpcjg8ixhw/privsep.sock Apr 20 16:02:54 user sudo[80339]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Apr 20 16:02:54 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.461s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Start building networks asynchronously for instance. 
{{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG nova.policy [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8a7606e886554ff7948a4e246dd98677', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3336309776d848efaf237863a5b9bfeb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Allocating IP information in the background. {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG nova.network.neutron [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:02:54 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 16:02:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG nova.policy [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '48eeb9edc18f48f0ad13c819cdac9106', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fbbcfeb5266f4ca6b9738b18ba7d127e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Start spawning the instance on the hypervisor. 
{{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:02:54 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Creating image(s) Apr 20 16:02:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Acquiring lock "/opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Lock "/opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Lock "/opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:55 user sudo[80339]: pam_unix(sudo:session): session closed for user root Apr 20 16:02:55 user nova-compute[71605]: INFO oslo.privsep.daemon [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Spawned new privsep daemon via rootwrap Apr 20 16:02:55 user nova-compute[71605]: INFO oslo.privsep.daemon [-] privsep daemon starting Apr 20 16:02:55 user nova-compute[71605]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Apr 20 16:02:55 user nova-compute[71605]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none Apr 20 16:02:55 user nova-compute[71605]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 80342 Apr 20 16:02:55 user nova-compute[71605]: WARNING oslo_privsep.priv_context [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] privsep daemon already running 
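Every qemu-img info invocation recorded above is wrapped in oslo_concurrency.prlimit, which is why each logged command line begins with "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 --". A minimal Python sketch of how such a resource-bounded call is issued through oslo.concurrency follows; the 1 GiB address-space and 30 s CPU limits are taken from the logged arguments, everything else (names, structure) is illustrative:

    # Sketch only: a resource-limited "qemu-img info" call like the ones logged.
    from oslo_concurrency import processutils

    QEMU_IMG_LIMITS = processutils.ProcessLimits(
        address_space=1073741824,   # --as=1073741824 in the logged command
        cpu_time=30)                # --cpu=30 in the logged command

    def qemu_img_info(path):
        # With prlimit= set, processutils.execute() prefixes the command with
        # "python -m oslo_concurrency.prlimit ...", matching the logged lines.
        out, _err = processutils.execute(
            'env', 'LC_ALL=C', 'LANG=C',
            'qemu-img', 'info', path, '--force-share', '--output=json',
            prlimit=QEMU_IMG_LIMITS)
        return out

The "privsep daemon already running" warnings that close this block appear to be benign: several concurrent build requests race to start the same privsep helper, only the first rootwrap invocation actually spawns it, and the later requests simply reuse the running daemon.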
Apr 20 16:02:55 user nova-compute[71605]: WARNING oslo_privsep.priv_context [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] privsep daemon already running Apr 20 16:02:55 user nova-compute[71605]: WARNING oslo_privsep.priv_context [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] privsep daemon already running Apr 20 16:02:55 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:55 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:55 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:55 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.128s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 
tempest-DeleteServersTestJSON-1315524687-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.144s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.network.neutron [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Successfully created port: 0b36b1a4-9ab6-49cb-9a5e-afc32792783e {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Acquiring lock "5bda996a-1bfe-4f43-aa02-36a864153588" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "5bda996a-1bfe-4f43-aa02-36a864153588" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.154s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.network.neutron [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Successfully created port: 4bce4922-407c-4e11-b089-154a3299ea1c {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.138s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.compute.manager [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Starting instance... 
{{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.139s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/6d55e5bd-9b03-40a9-bca9-88545039597c/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/6d55e5bd-9b03-40a9-bca9-88545039597c/disk 1073741824" returned: 0 in 0.048s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.198s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.197s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 
tempest-ServersNegativeTestJSON-942369263-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:02:56 user nova-compute[71605]: INFO nova.compute.claims [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Claim successful on node user Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.network.neutron [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Successfully created port: b2af67f0-0768-4ebc-a21b-0ef6e2b3f264 {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.166s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Checking if we can resize image /opt/stack/data/nova/instances/6d55e5bd-9b03-40a9-bca9-88545039597c/disk. 
size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6d55e5bd-9b03-40a9-bca9-88545039597c/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.171s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk 1073741824" returned: 0 in 0.055s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.232s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None 
req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.425s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6d55e5bd-9b03-40a9-bca9-88545039597c/disk --force-share --output=json" returned: 0 in 0.169s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Cannot resize image /opt/stack/data/nova/instances/6d55e5bd-9b03-40a9-bca9-88545039597c/disk to a smaller size. 
{{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.objects.instance [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Lazy-loading 'migration_context' on Instance uuid 6d55e5bd-9b03-40a9-bca9-88545039597c {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Ensure instance console log exists: /opt/stack/data/nova/instances/6d55e5bd-9b03-40a9-bca9-88545039597c/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.129s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Checking if we can resize image /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk. 
size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.130s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/a5e68386-3b32-458b-9808-797d041c2235/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/a5e68386-3b32-458b-9808-797d041c2235/disk 1073741824" returned: 0 in 0.048s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.183s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.network.neutron [None 
req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Successfully created port: fe98bff4-7b0f-4244-a254-fc9359c00aae {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.633s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json" returned: 0 in 0.180s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Cannot resize image /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk to a smaller size. 
{{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.objects.instance [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lazy-loading 'migration_context' on Instance uuid d4ea4d29-b178-4da2-b971-76f97031b244 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Ensure instance console log exists: /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': 
{'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.138s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Checking if we can resize image /opt/stack/data/nova/instances/a5e68386-3b32-458b-9808-797d041c2235/disk. size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a5e68386-3b32-458b-9808-797d041c2235/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.589s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.compute.manager [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Start building networks asynchronously for instance. 
{{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.152s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a5e68386-3b32-458b-9808-797d041c2235/disk --force-share --output=json" returned: 0 in 0.127s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Cannot resize image /opt/stack/data/nova/instances/a5e68386-3b32-458b-9808-797d041c2235/disk to a smaller size. 
{{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:02:56 user nova-compute[71605]: DEBUG nova.objects.instance [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lazy-loading 'migration_context' on Instance uuid a5e68386-3b32-458b-9808-797d041c2235 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk 1073741824" returned: 0 in 0.090s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Ensure instance console log exists: /opt/stack/data/nova/instances/a5e68386-3b32-458b-9808-797d041c2235/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG nova.compute.manager [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Allocating IP information in the background. 
{{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG nova.network.neutron [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.254s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:57 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 16:02:57 user nova-compute[71605]: DEBUG nova.compute.manager [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.150s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Checking if we can resize image /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk. 
size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG nova.compute.manager [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Start spawning the instance on the hypervisor. {{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:02:57 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Creating image(s) Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Acquiring lock "/opt/stack/data/nova/instances/5bda996a-1bfe-4f43-aa02-36a864153588/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "/opt/stack/data/nova/instances/5bda996a-1bfe-4f43-aa02-36a864153588/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "/opt/stack/data/nova/instances/5bda996a-1bfe-4f43-aa02-36a864153588/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Running cmd 
(subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk --force-share --output=json" returned: 0 in 0.124s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Cannot resize image /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk to a smaller size. {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG nova.objects.instance [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Lazy-loading 'migration_context' on Instance uuid 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Ensure instance console log exists: /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None 
req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.130s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG nova.policy [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f50dbce30f294bb0ba6bc2811025835d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cb0a5eb3796a4d3a871843f409c6ffbd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img 
info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.155s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/5bda996a-1bfe-4f43-aa02-36a864153588/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/5bda996a-1bfe-4f43-aa02-36a864153588/disk 1073741824" returned: 0 in 0.047s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.207s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.124s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Checking if we can resize image /opt/stack/data/nova/instances/5bda996a-1bfe-4f43-aa02-36a864153588/disk. 
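The qemu-img probes above run under oslo_concurrency.prlimit with --as=1073741824 (a 1 GiB address-space cap) and --cpu=30, and the instance disk is then created as a qcow2 overlay whose backing file is the cached base image under _base. A minimal sketch of the same pattern driven through oslo.concurrency, with paths and limits taken from the log (the ProcessLimits keyword names are an assumption about the library version):

    from oslo_concurrency import processutils

    BASE = '/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890'
    DISK = '/opt/stack/data/nova/instances/5bda996a-1bfe-4f43-aa02-36a864153588/disk'

    # Resource limits mirroring the logged wrapper "oslo_concurrency.prlimit --as=1073741824 --cpu=30".
    limits = processutils.ProcessLimits(address_space=1073741824, cpu_time=30)

    # Probe the base image; --force-share lets qemu-img read it while another process has it open.
    out, _err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info', BASE, '--force-share', '--output=json',
        prlimit=limits)

    # Create the instance disk as a 1 GiB qcow2 overlay backed by the raw base image, matching the
    # logged "qemu-img create -f qcow2 -o backing_file=...,backing_fmt=raw ... 1073741824".
    processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'create', '-f', 'qcow2',
        '-o', 'backing_file=%s,backing_fmt=raw' % BASE,
        DISK, '1073741824')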
size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5bda996a-1bfe-4f43-aa02-36a864153588/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5bda996a-1bfe-4f43-aa02-36a864153588/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Cannot resize image /opt/stack/data/nova/instances/5bda996a-1bfe-4f43-aa02-36a864153588/disk to a smaller size. {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG nova.objects.instance [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lazy-loading 'migration_context' on Instance uuid 5bda996a-1bfe-4f43-aa02-36a864153588 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Ensure instance console log exists: /opt/stack/data/nova/instances/5bda996a-1bfe-4f43-aa02-36a864153588/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock 
"vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:02:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG nova.network.neutron [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Successfully updated port: b2af67f0-0768-4ebc-a21b-0ef6e2b3f264 {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Acquiring lock "refresh_cache-91f4b3d1-0fea-4378-94e3-c2bbfd8cad81" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Acquired lock "refresh_cache-91f4b3d1-0fea-4378-94e3-c2bbfd8cad81" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG nova.network.neutron [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG nova.network.neutron [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Instance cache missing network info. {{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG nova.compute.manager [req-bf0d0ffd-6cf2-4ce8-8168-81ebf605576d req-24921570-59be-46fc-9589-76a3cece8581 service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Received event network-changed-b2af67f0-0768-4ebc-a21b-0ef6e2b3f264 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG nova.compute.manager [req-bf0d0ffd-6cf2-4ce8-8168-81ebf605576d req-24921570-59be-46fc-9589-76a3cece8581 service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Refreshing instance network info cache due to event network-changed-b2af67f0-0768-4ebc-a21b-0ef6e2b3f264. 
{{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-bf0d0ffd-6cf2-4ce8-8168-81ebf605576d req-24921570-59be-46fc-9589-76a3cece8581 service nova] Acquiring lock "refresh_cache-91f4b3d1-0fea-4378-94e3-c2bbfd8cad81" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG nova.network.neutron [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Successfully updated port: fe98bff4-7b0f-4244-a254-fc9359c00aae {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Acquiring lock "refresh_cache-6d55e5bd-9b03-40a9-bca9-88545039597c" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Acquired lock "refresh_cache-6d55e5bd-9b03-40a9-bca9-88545039597c" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG nova.network.neutron [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG nova.network.neutron [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Successfully updated port: 0b36b1a4-9ab6-49cb-9a5e-afc32792783e {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Acquiring lock "refresh_cache-d4ea4d29-b178-4da2-b971-76f97031b244" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Acquired lock "refresh_cache-d4ea4d29-b178-4da2-b971-76f97031b244" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG nova.network.neutron [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:02:59 user 
nova-compute[71605]: DEBUG nova.network.neutron [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Successfully updated port: 4bce4922-407c-4e11-b089-154a3299ea1c {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "refresh_cache-a5e68386-3b32-458b-9808-797d041c2235" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquired lock "refresh_cache-a5e68386-3b32-458b-9808-797d041c2235" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG nova.network.neutron [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG nova.network.neutron [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Instance cache missing network info. {{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG nova.compute.manager [req-c4ebbf9c-17e1-464f-9452-bef2cbde2973 req-3bb9e808-c0e6-4398-a985-afdaf90e98a7 service nova] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Received event network-changed-fe98bff4-7b0f-4244-a254-fc9359c00aae {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG nova.compute.manager [req-c4ebbf9c-17e1-464f-9452-bef2cbde2973 req-3bb9e808-c0e6-4398-a985-afdaf90e98a7 service nova] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Refreshing instance network info cache due to event network-changed-fe98bff4-7b0f-4244-a254-fc9359c00aae. 
{{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-c4ebbf9c-17e1-464f-9452-bef2cbde2973 req-3bb9e808-c0e6-4398-a985-afdaf90e98a7 service nova] Acquiring lock "refresh_cache-6d55e5bd-9b03-40a9-bca9-88545039597c" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG nova.compute.manager [req-8f934614-8628-4417-b8a1-e7214b266c16 req-8c7a3dc9-137c-405b-85b1-ed26e3a5218f service nova] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Received event network-changed-0b36b1a4-9ab6-49cb-9a5e-afc32792783e {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG nova.compute.manager [req-8f934614-8628-4417-b8a1-e7214b266c16 req-8c7a3dc9-137c-405b-85b1-ed26e3a5218f service nova] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Refreshing instance network info cache due to event network-changed-0b36b1a4-9ab6-49cb-9a5e-afc32792783e. {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:02:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8f934614-8628-4417-b8a1-e7214b266c16 req-8c7a3dc9-137c-405b-85b1-ed26e3a5218f service nova] Acquiring lock "refresh_cache-d4ea4d29-b178-4da2-b971-76f97031b244" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.network.neutron [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Instance cache missing network info. {{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.network.neutron [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Instance cache missing network info. 
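The repeated "Acquiring lock" / "Acquired lock" / "Releasing lock" and "acquired by ... :: waited/held" entries come from oslo.concurrency's lockutils: the refresh_cache-<uuid> locks are taken as context managers, while named locks such as "vgpu_resources" or the base-image hash are taken through the synchronized decorator. A minimal sketch of both patterns (the function names below are placeholders, not Nova code):

    from oslo_concurrency import lockutils

    # Pattern behind the "Acquiring lock" / "Acquired lock" / "Releasing lock" triplets:
    # a named lock taken as a context manager around the cache rebuild.
    def refresh_network_cache(instance_uuid):
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            pass  # rebuild the instance network info cache while holding the lock

    # Pattern behind the "acquired by ... :: waited/held" lines: the synchronized decorator,
    # here keyed on the base-image hash so only one request touches that backing file at a time.
    @lockutils.synchronized('4030659dc9e6940e4f224066d06e3784b1229890')
    def create_qcow2_overlay():
        pass  # e.g. the qemu-img create call sketched earlier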
{{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.network.neutron [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Updating instance_info_cache with network_info: [{"id": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "address": "fa:16:3e:d0:3f:7b", "network": {"id": "224391e3-9d6f-4e5f-b1bb-00dd1cd0ea06", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1568684394-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbbcfeb5266f4ca6b9738b18ba7d127e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2af67f0-07", "ovs_interfaceid": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Releasing lock "refresh_cache-91f4b3d1-0fea-4378-94e3-c2bbfd8cad81" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.compute.manager [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Instance network_info: |[{"id": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "address": "fa:16:3e:d0:3f:7b", "network": {"id": "224391e3-9d6f-4e5f-b1bb-00dd1cd0ea06", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1568684394-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbbcfeb5266f4ca6b9738b18ba7d127e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2af67f0-07", "ovs_interfaceid": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-bf0d0ffd-6cf2-4ce8-8168-81ebf605576d req-24921570-59be-46fc-9589-76a3cece8581 service nova] Acquired lock "refresh_cache-91f4b3d1-0fea-4378-94e3-c2bbfd8cad81" {{(pid=71605) 
lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.network.neutron [req-bf0d0ffd-6cf2-4ce8-8168-81ebf605576d req-24921570-59be-46fc-9589-76a3cece8581 service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Refreshing network info cache for port b2af67f0-0768-4ebc-a21b-0ef6e2b3f264 {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Start _get_guest_xml network_info=[{"id": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "address": "fa:16:3e:d0:3f:7b", "network": {"id": "224391e3-9d6f-4e5f-b1bb-00dd1cd0ea06", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1568684394-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbbcfeb5266f4ca6b9738b18ba7d127e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2af67f0-07", "ovs_interfaceid": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': '4ac69ea5-e5d7-40c8-864e-0a164d78a727'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:03:00 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:03:00 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] This host appears to have multiple sockets per NUMA node. 
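The instance_info_cache update above stores the port's view of the network as a JSON list (port ID, MAC, subnets, fixed IPs). A short, self-contained sketch that parses an abbreviated copy of that structure to pull out the addresses shown in the log:

    import json

    # Abbreviated copy of the network_info JSON recorded in the cache update above.
    network_info = json.loads('''
    [{"id": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264",
      "address": "fa:16:3e:d0:3f:7b",
      "devname": "tapb2af67f0-07",
      "type": "ovs",
      "active": false,
      "network": {"id": "224391e3-9d6f-4e5f-b1bb-00dd1cd0ea06",
                  "bridge": "br-int",
                  "subnets": [{"cidr": "10.0.0.0/28",
                               "ips": [{"address": "10.0.0.12", "type": "fixed"}]}]}}]
    ''')

    for vif in network_info:
        fixed_ips = [ip['address']
                     for subnet in vif['network']['subnets']
                     for ip in subnet['ips']]
        # Prints: b2af67f0-0768-4ebc-a21b-0ef6e2b3f264 fa:16:3e:d0:3f:7b tapb2af67f0-07 ['10.0.0.12']
        print(vif['id'], vif['address'], vif['devname'], fixed_ips)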
The `socket` PCI NUMA affinity will not be supported. Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), 
maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.privsep.utils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71605) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:02:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-942605486',display_name='tempest-ServerStableDeviceRescueTest-server-942605486',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-942605486',id=4,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCZym0vodVYP/JGy/H71EPtkiLL3pgSyqc+6Le0y9dituQzc/wfdGdwVdf4pgjAAE55MUTGyqTl0C2t2y934ULtfkrcvhTphaGXzfELzex4GVcPZlULQOFRqsadQFb89Hw==',key_name='tempest-keypair-58929071',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fbbcfeb5266f4ca6b9738b18ba7d127e',ramdisk_id='',reservation_id='r-zbg0o0ba',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-179851846',owner_user_name='tempest-ServerStableDeviceRescueTest-179851846-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:02:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='48eeb9edc18f48f0ad13c819cdac9106',uuid=91f4b3d1-0fea-4378-94e3-c2bbfd8cad81,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "address": "fa:16:3e:d0:3f:7b", "network": {"id": "224391e3-9d6f-4e5f-b1bb-00dd1cd0ea06", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1568684394-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbbcfeb5266f4ca6b9738b18ba7d127e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2af67f0-07", "ovs_interfaceid": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Converting VIF {"id": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "address": "fa:16:3e:d0:3f:7b", "network": {"id": "224391e3-9d6f-4e5f-b1bb-00dd1cd0ea06", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1568684394-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbbcfeb5266f4ca6b9738b18ba7d127e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2af67f0-07", "ovs_interfaceid": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:3f:7b,bridge_name='br-int',has_traffic_filtering=True,id=b2af67f0-0768-4ebc-a21b-0ef6e2b3f264,network=Network(224391e3-9d6f-4e5f-b1bb-00dd1cd0ea06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2af67f0-07') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.objects.instance [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Lazy-loading 'pci_devices' on Instance uuid 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] End _get_guest_xml xml= Apr 20 16:03:00 user nova-compute[71605]: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81 Apr 20 16:03:00 user nova-compute[71605]: instance-00000004 Apr 20 16:03:00 user nova-compute[71605]: 131072 Apr 20 16:03:00 user nova-compute[71605]: 1 Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: tempest-ServerStableDeviceRescueTest-server-942605486 Apr 20 16:03:00 user nova-compute[71605]: 2023-04-20 16:03:00 Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: 128 Apr 20 16:03:00 user nova-compute[71605]: 1 Apr 20 16:03:00 user nova-compute[71605]: 0 Apr 20 16:03:00 user nova-compute[71605]: 0 Apr 20 16:03:00 user nova-compute[71605]: 1 Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: tempest-ServerStableDeviceRescueTest-179851846-project-member Apr 20 16:03:00 user nova-compute[71605]: tempest-ServerStableDeviceRescueTest-179851846 Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: OpenStack Foundation Apr 20 16:03:00 user nova-compute[71605]: OpenStack Nova Apr 20 16:03:00 
user nova-compute[71605]: 0.0.0 Apr 20 16:03:00 user nova-compute[71605]: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81 Apr 20 16:03:00 user nova-compute[71605]: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81 Apr 20 16:03:00 user nova-compute[71605]: Virtual Machine Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: hvm Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Nehalem Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: /dev/urandom Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:02:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-942605486',display_name='tempest-ServerStableDeviceRescueTest-server-942605486',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-942605486',id=4,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCZym0vodVYP/JGy/H71EPtkiLL3pgSyqc+6Le0y9dituQzc/wfdGdwVdf4pgjAAE55MUTGyqTl0C2t2y934ULtfkrcvhTphaGXzfELzex4GVcPZlULQOFRqsadQFb89Hw==',key_name='tempest-keypair-58929071',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fbbcfeb5266f4ca6b9738b18ba7d127e',ramdisk_id='',reservation_id='r-zbg0o0ba',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-179851846',owner_user_name='tempest-ServerStableDeviceRescueTest-179851846-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:02:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='48eeb9edc18f48f0ad13c819cdac9106',uuid=91f4b3d1-0fea-4378-94e3-c2bbfd8cad81,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "address": "fa:16:3e:d0:3f:7b", "network": {"id": "224391e3-9d6f-4e5f-b1bb-00dd1cd0ea06", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1568684394-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbbcfeb5266f4ca6b9738b18ba7d127e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2af67f0-07", "ovs_interfaceid": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Converting VIF {"id": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "address": "fa:16:3e:d0:3f:7b", "network": {"id": "224391e3-9d6f-4e5f-b1bb-00dd1cd0ea06", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1568684394-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": 
"10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbbcfeb5266f4ca6b9738b18ba7d127e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2af67f0-07", "ovs_interfaceid": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:3f:7b,bridge_name='br-int',has_traffic_filtering=True,id=b2af67f0-0768-4ebc-a21b-0ef6e2b3f264,network=Network(224391e3-9d6f-4e5f-b1bb-00dd1cd0ea06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2af67f0-07') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG os_vif [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:3f:7b,bridge_name='br-int',has_traffic_filtering=True,id=b2af67f0-0768-4ebc-a21b-0ef6e2b3f264,network=Network(224391e3-9d6f-4e5f-b1bb-00dd1cd0ea06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2af67f0-07') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.network.neutron [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Updating instance_info_cache with network_info: [{"id": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "address": "fa:16:3e:44:d8:d0", "network": {"id": "c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b36b1a4-9a", "ovs_interfaceid": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Created schema index Interface.name {{(pid=71605) autocreate_indices 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Created schema index Port.name {{(pid=71605) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Created schema index Bridge.name {{(pid=71605) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] tcp:127.0.0.1:6640: entering CONNECTING {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [POLLOUT] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Releasing lock "refresh_cache-d4ea4d29-b178-4da2-b971-76f97031b244" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.compute.manager [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Instance network_info: |[{"id": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "address": "fa:16:3e:44:d8:d0", "network": {"id": "c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b36b1a4-9a", "ovs_interfaceid": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8f934614-8628-4417-b8a1-e7214b266c16 req-8c7a3dc9-137c-405b-85b1-ed26e3a5218f service nova] Acquired lock "refresh_cache-d4ea4d29-b178-4da2-b971-76f97031b244" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.network.neutron [req-8f934614-8628-4417-b8a1-e7214b266c16 req-8c7a3dc9-137c-405b-85b1-ed26e3a5218f service nova] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Refreshing network info cache for port 0b36b1a4-9ab6-49cb-9a5e-afc32792783e {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Start _get_guest_xml network_info=[{"id": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "address": "fa:16:3e:44:d8:d0", "network": {"id": "c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b36b1a4-9a", "ovs_interfaceid": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': '4ac69ea5-e5d7-40c8-864e-0a164d78a727'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) 
_get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:03:00 user nova-compute[71605]: INFO oslo.privsep.daemon [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpo2ecrdwa/privsep.sock'] Apr 20 16:03:00 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:03:00 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:03:00 user sudo[80426]: stack : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context vif_plug_ovs.privsep.vif_plug --privsep_sock_path /tmp/tmpo2ecrdwa/privsep.sock Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:03:00 user sudo[80426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:02:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1335393395',display_name='tempest-ServersNegativeTestJSON-server-1335393395',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1335393395',id=2,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d8444d3c8f554a56967917670b19dc37',ramdisk_id='',reservation_id='r-955d2plh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-942369263',owner_user_name='tempest-ServersNegativeTestJSON-942369263-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:02:53Z,user_data=None,user_id='9be25e958c6047068ab5ce63106b0754',uuid=d4ea4d29-b178-4da2-b971-76f97031b244,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "address": "fa:16:3e:44:d8:d0", "network": {"id": "c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b36b1a4-9a", "ovs_interfaceid": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Converting VIF {"id": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "address": "fa:16:3e:44:d8:d0", "network": {"id": 
"c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b36b1a4-9a", "ovs_interfaceid": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:d8:d0,bridge_name='br-int',has_traffic_filtering=True,id=0b36b1a4-9ab6-49cb-9a5e-afc32792783e,network=Network(c36830a6-66f7-4f28-8879-e228da46cead),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b36b1a4-9a') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.objects.instance [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lazy-loading 'pci_devices' on Instance uuid d4ea4d29-b178-4da2-b971-76f97031b244 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] End _get_guest_xml xml= Apr 20 16:03:00 user nova-compute[71605]: d4ea4d29-b178-4da2-b971-76f97031b244 Apr 20 16:03:00 user nova-compute[71605]: instance-00000002 Apr 20 16:03:00 user nova-compute[71605]: 131072 Apr 20 16:03:00 user nova-compute[71605]: 1 Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: tempest-ServersNegativeTestJSON-server-1335393395 Apr 20 16:03:00 user nova-compute[71605]: 2023-04-20 16:03:00 Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: 128 Apr 20 16:03:00 user nova-compute[71605]: 1 Apr 20 16:03:00 user nova-compute[71605]: 0 Apr 20 16:03:00 user nova-compute[71605]: 0 Apr 20 16:03:00 user nova-compute[71605]: 1 Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: tempest-ServersNegativeTestJSON-942369263-project-member Apr 20 16:03:00 user nova-compute[71605]: tempest-ServersNegativeTestJSON-942369263 Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user nova-compute[71605]: Apr 20 16:03:00 user 
nova-compute[71605]: [libvirt guest XML omitted; fields visible in the stripped dump: sysinfo OpenStack Foundation / OpenStack Nova 0.0.0, uuid d4ea4d29-b178-4da2-b971-76f97031b244, Virtual Machine, os type hvm, cpu model Nehalem, rng /dev/urandom] {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:02:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1335393395',display_name='tempest-ServersNegativeTestJSON-server-1335393395',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1335393395',id=2,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d8444d3c8f554a56967917670b19dc37',ramdisk_id='',reservation_id='r-955d2plh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-942369263',owner_user_name='tempest-ServersNegativeTestJSON-942369263-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:02:53Z,user_data=None,user_id='9be25e958c6047068ab5ce63106b0754',uuid=d4ea4d29-b178-4da2-b971-76f97031b244,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "address": "fa:16:3e:44:d8:d0", "network": {"id": "c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b36b1a4-9a", "ovs_interfaceid": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Converting VIF {"id": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "address": "fa:16:3e:44:d8:d0", "network": {"id": 
"c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b36b1a4-9a", "ovs_interfaceid": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:d8:d0,bridge_name='br-int',has_traffic_filtering=True,id=0b36b1a4-9ab6-49cb-9a5e-afc32792783e,network=Network(c36830a6-66f7-4f28-8879-e228da46cead),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b36b1a4-9a') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG os_vif [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:d8:d0,bridge_name='br-int',has_traffic_filtering=True,id=0b36b1a4-9ab6-49cb-9a5e-afc32792783e,network=Network(c36830a6-66f7-4f28-8879-e228da46cead),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b36b1a4-9a') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:00 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.network.neutron [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Successfully created port: 5287c61f-56b9-4a9f-87e7-ab7057df84be {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.network.neutron [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 
tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Updating instance_info_cache with network_info: [{"id": "fe98bff4-7b0f-4244-a254-fc9359c00aae", "address": "fa:16:3e:8f:d5:e9", "network": {"id": "892f26a6-0815-41df-a910-d2e69e162820", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1019885347-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3336309776d848efaf237863a5b9bfeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe98bff4-7b", "ovs_interfaceid": "fe98bff4-7b0f-4244-a254-fc9359c00aae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Releasing lock "refresh_cache-6d55e5bd-9b03-40a9-bca9-88545039597c" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.compute.manager [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Instance network_info: |[{"id": "fe98bff4-7b0f-4244-a254-fc9359c00aae", "address": "fa:16:3e:8f:d5:e9", "network": {"id": "892f26a6-0815-41df-a910-d2e69e162820", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1019885347-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3336309776d848efaf237863a5b9bfeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe98bff4-7b", "ovs_interfaceid": "fe98bff4-7b0f-4244-a254-fc9359c00aae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-c4ebbf9c-17e1-464f-9452-bef2cbde2973 req-3bb9e808-c0e6-4398-a985-afdaf90e98a7 service nova] Acquired lock "refresh_cache-6d55e5bd-9b03-40a9-bca9-88545039597c" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.network.neutron [req-c4ebbf9c-17e1-464f-9452-bef2cbde2973 req-3bb9e808-c0e6-4398-a985-afdaf90e98a7 service nova] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] 
Refreshing network info cache for port fe98bff4-7b0f-4244-a254-fc9359c00aae {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Start _get_guest_xml network_info=[{"id": "fe98bff4-7b0f-4244-a254-fc9359c00aae", "address": "fa:16:3e:8f:d5:e9", "network": {"id": "892f26a6-0815-41df-a910-d2e69e162820", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1019885347-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3336309776d848efaf237863a5b9bfeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe98bff4-7b", "ovs_interfaceid": "fe98bff4-7b0f-4244-a254-fc9359c00aae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': '4ac69ea5-e5d7-40c8-864e-0a164d78a727'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:03:01 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:03:01 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 
20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:02:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-808094645',display_name='tempest-DeleteServersTestJSON-server-808094645',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-808094645',id=3,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3336309776d848efaf237863a5b9bfeb',ramdisk_id='',reservation_id='r-scto8378',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1315524687',owner_user_name='tempest-DeleteServersTestJSON-1315524687-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-
20T16:02:54Z,user_data=None,user_id='8a7606e886554ff7948a4e246dd98677',uuid=6d55e5bd-9b03-40a9-bca9-88545039597c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe98bff4-7b0f-4244-a254-fc9359c00aae", "address": "fa:16:3e:8f:d5:e9", "network": {"id": "892f26a6-0815-41df-a910-d2e69e162820", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1019885347-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3336309776d848efaf237863a5b9bfeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe98bff4-7b", "ovs_interfaceid": "fe98bff4-7b0f-4244-a254-fc9359c00aae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Converting VIF {"id": "fe98bff4-7b0f-4244-a254-fc9359c00aae", "address": "fa:16:3e:8f:d5:e9", "network": {"id": "892f26a6-0815-41df-a910-d2e69e162820", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1019885347-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3336309776d848efaf237863a5b9bfeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe98bff4-7b", "ovs_interfaceid": "fe98bff4-7b0f-4244-a254-fc9359c00aae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:d5:e9,bridge_name='br-int',has_traffic_filtering=True,id=fe98bff4-7b0f-4244-a254-fc9359c00aae,network=Network(892f26a6-0815-41df-a910-d2e69e162820),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe98bff4-7b') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.objects.instance [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Lazy-loading 'pci_devices' on Instance uuid 6d55e5bd-9b03-40a9-bca9-88545039597c {{(pid=71605) obj_load_attr 
/opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] End _get_guest_xml xml= [libvirt guest XML omitted; fields visible in the stripped dump: uuid 6d55e5bd-9b03-40a9-bca9-88545039597c, domain name instance-00000003, memory 131072 KiB, 1 vCPU, display name tempest-DeleteServersTestJSON-server-808094645, created 2023-04-20 16:03:01, owner tempest-DeleteServersTestJSON-1315524687-project-member / tempest-DeleteServersTestJSON-1315524687, sysinfo OpenStack Foundation / OpenStack Nova 0.0.0, Virtual Machine, os type hvm, cpu model Nehalem] Apr 20 16:03:01
user nova-compute[71605]: Apr 20 16:03:01 user nova-compute[71605]: Apr 20 16:03:01 user nova-compute[71605]: Apr 20 16:03:01 user nova-compute[71605]: Apr 20 16:03:01 user nova-compute[71605]: Apr 20 16:03:01 user nova-compute[71605]: Apr 20 16:03:01 user nova-compute[71605]: Apr 20 16:03:01 user nova-compute[71605]: Apr 20 16:03:01 user nova-compute[71605]: /dev/urandom Apr 20 16:03:01 user nova-compute[71605]: Apr 20 16:03:01 user nova-compute[71605]: Apr 20 16:03:01 user nova-compute[71605]: Apr 20 16:03:01 user nova-compute[71605]: Apr 20 16:03:01 user nova-compute[71605]: Apr 20 16:03:01 user nova-compute[71605]: Apr 20 16:03:01 user nova-compute[71605]: Apr 20 16:03:01 user nova-compute[71605]: {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:02:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-808094645',display_name='tempest-DeleteServersTestJSON-server-808094645',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-808094645',id=3,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3336309776d848efaf237863a5b9bfeb',ramdisk_id='',reservation_id='r-scto8378',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-1315524687',owner_user_name='tempest-DeleteServersTestJSON-1315524687-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:02:54Z,user_data=None,user_id='8a7606e886554ff7948a4e246dd98677',uuid=6d55e5bd-9b03-40a9-bca9-88545039597c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe98bff4-7b0f-4244-a254-fc9359c00aae", "address": "fa:16:3e:8f:d5:e9", "network": {"id": "892f26a6-0815-41df-a910-d2e69e162820", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1019885347-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3336309776d848efaf237863a5b9bfeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe98bff4-7b", "ovs_interfaceid": "fe98bff4-7b0f-4244-a254-fc9359c00aae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Converting VIF {"id": "fe98bff4-7b0f-4244-a254-fc9359c00aae", "address": "fa:16:3e:8f:d5:e9", "network": {"id": "892f26a6-0815-41df-a910-d2e69e162820", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1019885347-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3336309776d848efaf237863a5b9bfeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe98bff4-7b", "ovs_interfaceid": "fe98bff4-7b0f-4244-a254-fc9359c00aae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:d5:e9,bridge_name='br-int',has_traffic_filtering=True,id=fe98bff4-7b0f-4244-a254-fc9359c00aae,network=Network(892f26a6-0815-41df-a910-d2e69e162820),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe98bff4-7b') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG os_vif [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:d5:e9,bridge_name='br-int',has_traffic_filtering=True,id=fe98bff4-7b0f-4244-a254-fc9359c00aae,network=Network(892f26a6-0815-41df-a910-d2e69e162820),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe98bff4-7b') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, 
datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.network.neutron [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Updating instance_info_cache with network_info: [{"id": "4bce4922-407c-4e11-b089-154a3299ea1c", "address": "fa:16:3e:bd:61:95", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bce4922-40", "ovs_interfaceid": "4bce4922-407c-4e11-b089-154a3299ea1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Releasing lock "refresh_cache-a5e68386-3b32-458b-9808-797d041c2235" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.compute.manager [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Instance network_info: |[{"id": "4bce4922-407c-4e11-b089-154a3299ea1c", "address": "fa:16:3e:bd:61:95", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bce4922-40", "ovs_interfaceid": "4bce4922-407c-4e11-b089-154a3299ea1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:03:01 
user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Start _get_guest_xml network_info=[{"id": "4bce4922-407c-4e11-b089-154a3299ea1c", "address": "fa:16:3e:bd:61:95", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bce4922-40", "ovs_interfaceid": "4bce4922-407c-4e11-b089-154a3299ea1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': '4ac69ea5-e5d7-40c8-864e-0a164d78a727'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:03:01 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:03:01 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
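Annotation: the ovsdbapp transactions logged around VIF plugging in this section (AddBridgeCommand above, AddPortCommand plus DbSetCommand on the Interface's external_ids further below) are the programmatic form of a couple of ovs-vsctl operations. The sketch below is a minimal, hypothetical illustration of that equivalence, not what os-vif actually executes; the plug_port helper name is invented, ovs-vsctl is assumed to be installed, and the port/ID values are copied from the log entries purely as an example.

import subprocess

def plug_port(bridge, port, iface_id, mac, vm_uuid):
    # Rough equivalent of AddBridgeCommand(name=bridge, may_exist=True, datapath_type=system)
    subprocess.run(["ovs-vsctl", "--may-exist", "add-br", bridge,
                    "--", "set", "Bridge", bridge, "datapath_type=system"],
                   check=True)
    # Rough equivalent of AddPortCommand(bridge, port, may_exist=True) followed by
    # DbSetCommand(table=Interface, record=port, external_ids={...})
    subprocess.run(["ovs-vsctl", "--may-exist", "add-port", bridge, port,
                    "--", "set", "Interface", port,
                    f"external_ids:iface-id={iface_id}",
                    "external_ids:iface-status=active",
                    f"external_ids:attached-mac={mac}",
                    f"external_ids:vm-uuid={vm_uuid}"],
                   check=True)

# Values taken from the AddPortCommand/DbSetCommand entries logged below.
plug_port("br-int", "tap4bce4922-40",
          "4bce4922-407c-4e11-b089-154a3299ea1c",
          "fa:16:3e:bd:61:95",
          "a5e68386-3b32-458b-9808-797d041c2235")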
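Annotation: the nova.virt.hardware entries that follow ("Flavor limits 0:0:0", "Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536", "Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)]") show a guest CPU topology being derived when neither flavor extra specs nor image properties constrain it. The following is a simplified, hypothetical sketch of that selection for the trivial 1-vCPU case; possible_topologies and the namedtuple here are illustrative stand-ins, not nova's actual classes or algorithm.

import itertools
from collections import namedtuple

VirtCPUTopology = namedtuple("VirtCPUTopology", "sockets cores threads")

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    # Enumerate socket/core/thread counts whose product exactly covers the vCPUs,
    # capped by the (effectively unlimited) maxima seen in the log.
    for s, c, t in itertools.product(range(1, min(max_sockets, vcpus) + 1),
                                     range(1, min(max_cores, vcpus) + 1),
                                     range(1, min(max_threads, vcpus) + 1)):
        if s * c * t == vcpus:
            yield VirtCPUTopology(s, c, t)

# For the 1-vCPU m1.nano flavor the only candidate is 1 socket x 1 core x 1 thread,
# matching "Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)]".
print(list(possible_topologies(1)))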
Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:02:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-112425079',display_name='tempest-AttachVolumeNegativeTest-server-112425079',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-112425079',id=1,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBFt/ltdSvvBHQ2MsuXJOTGRFwD86myzO9h0omThgGXoNYZmwXr9cWEFLEKbGl6QHLxLCdivOfggvbdx8hlLQgYsXTya/bJWP27fOABo2+ny5YKslC9RhYnn4AafsHJgFg==',key_name='tempest-keypair-1817335126',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71cf2664111f45788d24092e8ceede9c',ramdisk_id='',reservation_id='r-huby0z08',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-308436039',owner_user_name='tempest-AttachVolumeNegativeTest-308436039-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:02:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='690c49feae904687826fb959ba5ba283',uuid=a5e68386-3b32-458b-9808-797d041c2235,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4bce4922-407c-4e11-b089-154a3299ea1c", "address": "fa:16:3e:bd:61:95", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bce4922-40", "ovs_interfaceid": "4bce4922-407c-4e11-b089-154a3299ea1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Converting VIF {"id": "4bce4922-407c-4e11-b089-154a3299ea1c", "address": "fa:16:3e:bd:61:95", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bce4922-40", "ovs_interfaceid": "4bce4922-407c-4e11-b089-154a3299ea1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:03:01 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:61:95,bridge_name='br-int',has_traffic_filtering=True,id=4bce4922-407c-4e11-b089-154a3299ea1c,network=Network(2dc9b3da-0124-4718-9f70-a131cd030480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bce4922-40') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG nova.objects.instance [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lazy-loading 'pci_devices' on Instance uuid a5e68386-3b32-458b-9808-797d041c2235 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG nova.compute.manager [req-860312e7-9d2d-4f8e-820f-eb218f3b4b9e req-fba945ff-f740-4e46-918a-1bc309a34dde service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] Received event network-changed-4bce4922-407c-4e11-b089-154a3299ea1c {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG nova.compute.manager [req-860312e7-9d2d-4f8e-820f-eb218f3b4b9e req-fba945ff-f740-4e46-918a-1bc309a34dde service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] Refreshing instance network info cache due to event network-changed-4bce4922-407c-4e11-b089-154a3299ea1c. 
{{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}}
Apr 20 16:03:02 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-860312e7-9d2d-4f8e-820f-eb218f3b4b9e req-fba945ff-f740-4e46-918a-1bc309a34dde service nova] Acquiring lock "refresh_cache-a5e68386-3b32-458b-9808-797d041c2235" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
Apr 20 16:03:02 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-860312e7-9d2d-4f8e-820f-eb218f3b4b9e req-fba945ff-f740-4e46-918a-1bc309a34dde service nova] Acquired lock "refresh_cache-a5e68386-3b32-458b-9808-797d041c2235" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
Apr 20 16:03:02 user nova-compute[71605]: DEBUG nova.network.neutron [req-860312e7-9d2d-4f8e-820f-eb218f3b4b9e req-fba945ff-f740-4e46-918a-1bc309a34dde service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] Refreshing network info cache for port 4bce4922-407c-4e11-b089-154a3299ea1c {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}}
Apr 20 16:03:02 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] End _get_guest_xml xml= [libvirt guest XML elided: the domain XML markup was lost during log capture, leaving timestamp-only entries; only the element text survives: a5e68386-3b32-458b-9808-797d041c2235, instance-00000001, 131072, 1, tempest-AttachVolumeNegativeTest-server-112425079, 2023-04-20 16:03:01, 128, 1, 0, 0, 1, tempest-AttachVolumeNegativeTest-308436039-project-member, tempest-AttachVolumeNegativeTest-308436039, OpenStack Foundation, OpenStack Nova, 0.0.0, Virtual Machine, hvm, Nehalem, /dev/urandom] {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}}
Apr 20 16:03:02 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:02:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-112425079',display_name='tempest-AttachVolumeNegativeTest-server-112425079',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-112425079',id=1,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBFt/ltdSvvBHQ2MsuXJOTGRFwD86myzO9h0omThgGXoNYZmwXr9cWEFLEKbGl6QHLxLCdivOfggvbdx8hlLQgYsXTya/bJWP27fOABo2+ny5YKslC9RhYnn4AafsHJgFg==',key_name='tempest-keypair-1817335126',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71cf2664111f45788d24092e8ceede9c',ramdisk_id='',reservation_id='r-huby0z08',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-308436039',owner_user_name='tempest-AttachVolumeNegativeTest-308436039-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:02:53Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='690c49feae904687826fb959ba5ba283',uuid=a5e68386-3b32-458b-9808-797d041c2235,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4bce4922-407c-4e11-b089-154a3299ea1c", "address": "fa:16:3e:bd:61:95", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bce4922-40", "ovs_interfaceid": "4bce4922-407c-4e11-b089-154a3299ea1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Converting VIF {"id": "4bce4922-407c-4e11-b089-154a3299ea1c", "address": "fa:16:3e:bd:61:95", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bce4922-40", "ovs_interfaceid": "4bce4922-407c-4e11-b089-154a3299ea1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:61:95,bridge_name='br-int',has_traffic_filtering=True,id=4bce4922-407c-4e11-b089-154a3299ea1c,network=Network(2dc9b3da-0124-4718-9f70-a131cd030480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bce4922-40') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG os_vif [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:61:95,bridge_name='br-int',has_traffic_filtering=True,id=4bce4922-407c-4e11-b089-154a3299ea1c,network=Network(2dc9b3da-0124-4718-9f70-a131cd030480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bce4922-40') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG nova.network.neutron [req-8f934614-8628-4417-b8a1-e7214b266c16 req-8c7a3dc9-137c-405b-85b1-ed26e3a5218f service nova] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Updated VIF entry in instance network info cache for port 0b36b1a4-9ab6-49cb-9a5e-afc32792783e. 
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG nova.network.neutron [req-8f934614-8628-4417-b8a1-e7214b266c16 req-8c7a3dc9-137c-405b-85b1-ed26e3a5218f service nova] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Updating instance_info_cache with network_info: [{"id": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "address": "fa:16:3e:44:d8:d0", "network": {"id": "c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b36b1a4-9a", "ovs_interfaceid": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8f934614-8628-4417-b8a1-e7214b266c16 req-8c7a3dc9-137c-405b-85b1-ed26e3a5218f service nova] Releasing lock "refresh_cache-d4ea4d29-b178-4da2-b971-76f97031b244" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG nova.network.neutron [req-bf0d0ffd-6cf2-4ce8-8168-81ebf605576d req-24921570-59be-46fc-9589-76a3cece8581 service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Updated VIF entry in instance network info cache for port b2af67f0-0768-4ebc-a21b-0ef6e2b3f264. 
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG nova.network.neutron [req-bf0d0ffd-6cf2-4ce8-8168-81ebf605576d req-24921570-59be-46fc-9589-76a3cece8581 service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Updating instance_info_cache with network_info: [{"id": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "address": "fa:16:3e:d0:3f:7b", "network": {"id": "224391e3-9d6f-4e5f-b1bb-00dd1cd0ea06", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1568684394-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbbcfeb5266f4ca6b9738b18ba7d127e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2af67f0-07", "ovs_interfaceid": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-bf0d0ffd-6cf2-4ce8-8168-81ebf605576d req-24921570-59be-46fc-9589-76a3cece8581 service nova] Releasing lock "refresh_cache-91f4b3d1-0fea-4378-94e3-c2bbfd8cad81" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:02 user sudo[80426]: pam_unix(sudo:session): session closed for user root Apr 20 16:03:02 user nova-compute[71605]: INFO oslo.privsep.daemon [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Spawned new privsep daemon via rootwrap Apr 20 16:03:02 user nova-compute[71605]: INFO oslo.privsep.daemon [-] privsep daemon starting Apr 20 16:03:02 user nova-compute[71605]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Apr 20 16:03:02 user nova-compute[71605]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none Apr 20 16:03:02 user nova-compute[71605]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 80429 Apr 20 16:03:02 user nova-compute[71605]: WARNING oslo_privsep.priv_context [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] privsep daemon already running Apr 20 16:03:02 user nova-compute[71605]: WARNING oslo_privsep.priv_context [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] privsep daemon already running Apr 20 16:03:02 user nova-compute[71605]: WARNING oslo_privsep.priv_context [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 
tempest-AttachVolumeNegativeTest-308436039-project-member] privsep daemon already running Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2af67f0-07, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb2af67f0-07, col_values=(('external_ids', {'iface-id': 'b2af67f0-0768-4ebc-a21b-0ef6e2b3f264', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d0:3f:7b', 'vm-uuid': '91f4b3d1-0fea-4378-94e3-c2bbfd8cad81'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0b36b1a4-9a, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0b36b1a4-9a, col_values=(('external_ids', {'iface-id': '0b36b1a4-9ab6-49cb-9a5e-afc32792783e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:44:d8:d0', 'vm-uuid': 'd4ea4d29-b178-4da2-b971-76f97031b244'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:02 user nova-compute[71605]: INFO os_vif [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:3f:7b,bridge_name='br-int',has_traffic_filtering=True,id=b2af67f0-0768-4ebc-a21b-0ef6e2b3f264,network=Network(224391e3-9d6f-4e5f-b1bb-00dd1cd0ea06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2af67f0-07') Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 
16:03:02 user nova-compute[71605]: INFO os_vif [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:d8:d0,bridge_name='br-int',has_traffic_filtering=True,id=0b36b1a4-9ab6-49cb-9a5e-afc32792783e,network=Network(c36830a6-66f7-4f28-8879-e228da46cead),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b36b1a4-9a') Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4bce4922-40, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4bce4922-40, col_values=(('external_ids', {'iface-id': '4bce4922-407c-4e11-b089-154a3299ea1c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bd:61:95', 'vm-uuid': 'a5e68386-3b32-458b-9808-797d041c2235'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:02 user nova-compute[71605]: INFO os_vif [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:61:95,bridge_name='br-int',has_traffic_filtering=True,id=4bce4922-407c-4e11-b089-154a3299ea1c,network=Network(2dc9b3da-0124-4718-9f70-a131cd030480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bce4922-40') Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe98bff4-7b, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfe98bff4-7b, col_values=(('external_ids', {'iface-id': 'fe98bff4-7b0f-4244-a254-fc9359c00aae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8f:d5:e9', 'vm-uuid': '6d55e5bd-9b03-40a9-bca9-88545039597c'}),)) {{(pid=71605) do_commit 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] No BDM found with device name vda, not building metadata. {{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] No VIF found with MAC fa:16:3e:d0:3f:7b, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:03:02 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] No BDM found with device name vda, not building metadata. {{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:03:03 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] No VIF found with MAC fa:16:3e:44:d8:d0, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:03:03 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:03 user nova-compute[71605]: INFO os_vif [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:d5:e9,bridge_name='br-int',has_traffic_filtering=True,id=fe98bff4-7b0f-4244-a254-fc9359c00aae,network=Network(892f26a6-0815-41df-a910-d2e69e162820),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe98bff4-7b') Apr 20 16:03:03 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:03:03 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] No VIF found with MAC fa:16:3e:bd:61:95, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:03:03 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] No BDM found with device name vda, not building metadata. {{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:03:03 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] No VIF found with MAC fa:16:3e:8f:d5:e9, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:03:03 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:04 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:04 user nova-compute[71605]: DEBUG nova.network.neutron [req-c4ebbf9c-17e1-464f-9452-bef2cbde2973 req-3bb9e808-c0e6-4398-a985-afdaf90e98a7 service nova] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Updated VIF entry in instance network info cache for port fe98bff4-7b0f-4244-a254-fc9359c00aae. 
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:03:04 user nova-compute[71605]: DEBUG nova.network.neutron [req-c4ebbf9c-17e1-464f-9452-bef2cbde2973 req-3bb9e808-c0e6-4398-a985-afdaf90e98a7 service nova] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Updating instance_info_cache with network_info: [{"id": "fe98bff4-7b0f-4244-a254-fc9359c00aae", "address": "fa:16:3e:8f:d5:e9", "network": {"id": "892f26a6-0815-41df-a910-d2e69e162820", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1019885347-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3336309776d848efaf237863a5b9bfeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe98bff4-7b", "ovs_interfaceid": "fe98bff4-7b0f-4244-a254-fc9359c00aae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:03:04 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-c4ebbf9c-17e1-464f-9452-bef2cbde2973 req-3bb9e808-c0e6-4398-a985-afdaf90e98a7 service nova] Releasing lock "refresh_cache-6d55e5bd-9b03-40a9-bca9-88545039597c" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:03:04 user nova-compute[71605]: DEBUG nova.network.neutron [req-860312e7-9d2d-4f8e-820f-eb218f3b4b9e req-fba945ff-f740-4e46-918a-1bc309a34dde service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] Updated VIF entry in instance network info cache for port 4bce4922-407c-4e11-b089-154a3299ea1c. 
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:03:04 user nova-compute[71605]: DEBUG nova.network.neutron [req-860312e7-9d2d-4f8e-820f-eb218f3b4b9e req-fba945ff-f740-4e46-918a-1bc309a34dde service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] Updating instance_info_cache with network_info: [{"id": "4bce4922-407c-4e11-b089-154a3299ea1c", "address": "fa:16:3e:bd:61:95", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bce4922-40", "ovs_interfaceid": "4bce4922-407c-4e11-b089-154a3299ea1c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:03:04 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-860312e7-9d2d-4f8e-820f-eb218f3b4b9e req-fba945ff-f740-4e46-918a-1bc309a34dde service nova] Releasing lock "refresh_cache-a5e68386-3b32-458b-9808-797d041c2235" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:03:05 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:05 user nova-compute[71605]: DEBUG nova.network.neutron [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Successfully updated port: 5287c61f-56b9-4a9f-87e7-ab7057df84be {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:03:05 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Acquiring lock "refresh_cache-5bda996a-1bfe-4f43-aa02-36a864153588" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:03:05 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Acquired lock "refresh_cache-5bda996a-1bfe-4f43-aa02-36a864153588" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:03:05 user nova-compute[71605]: DEBUG nova.network.neutron [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Building network info cache for instance {{(pid=71605) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:03:05 user nova-compute[71605]: DEBUG nova.compute.manager [req-67aff657-3e49-4b49-81d7-4895f0cd3e60 req-cdf4dfd2-385f-4e3b-9af9-e568f6621873 service nova] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Received event network-changed-5287c61f-56b9-4a9f-87e7-ab7057df84be {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:05 user nova-compute[71605]: DEBUG nova.compute.manager [req-67aff657-3e49-4b49-81d7-4895f0cd3e60 req-cdf4dfd2-385f-4e3b-9af9-e568f6621873 service nova] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Refreshing instance network info cache due to event network-changed-5287c61f-56b9-4a9f-87e7-ab7057df84be. {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:03:05 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-67aff657-3e49-4b49-81d7-4895f0cd3e60 req-cdf4dfd2-385f-4e3b-9af9-e568f6621873 service nova] Acquiring lock "refresh_cache-5bda996a-1bfe-4f43-aa02-36a864153588" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:03:05 user nova-compute[71605]: DEBUG nova.network.neutron [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Instance cache missing network info. {{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.network.neutron [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Updating instance_info_cache with network_info: [{"id": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "address": "fa:16:3e:0c:45:0b", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5287c61f-56", "ovs_interfaceid": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Releasing lock "refresh_cache-5bda996a-1bfe-4f43-aa02-36a864153588" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.compute.manager [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 
tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Instance network_info: |[{"id": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "address": "fa:16:3e:0c:45:0b", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5287c61f-56", "ovs_interfaceid": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-67aff657-3e49-4b49-81d7-4895f0cd3e60 req-cdf4dfd2-385f-4e3b-9af9-e568f6621873 service nova] Acquired lock "refresh_cache-5bda996a-1bfe-4f43-aa02-36a864153588" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.network.neutron [req-67aff657-3e49-4b49-81d7-4895f0cd3e60 req-cdf4dfd2-385f-4e3b-9af9-e568f6621873 service nova] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Refreshing network info cache for port 5287c61f-56b9-4a9f-87e7-ab7057df84be {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Start _get_guest_xml network_info=[{"id": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "address": "fa:16:3e:0c:45:0b", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5287c61f-56", "ovs_interfaceid": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': '4ac69ea5-e5d7-40c8-864e-0a164d78a727'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:03:06 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:03:06 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f 
tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:02:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-577930116',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-577930116',id=5,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMkHFFsWtozUTkF0VpQ+Cd6z15wOd291X4e8/v6QbZKdTx6+gptvNMQSpe0ybBenimgtpgGav2HnMz19ylSDLLeiOEgxywkrcPA8Jq0CjCrxBO54bQ0ViTd2ITYv71kQ9Q==',key_name='tempest-keypair-1173247378',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cb0a5eb3796a4d3a871843f409c6ffbd',ramdisk_id='',reservation_id='r-mig0m4d8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1118127371',owner_user_name='tempest-AttachVolumeShelveTestJSON-1118127371-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:02:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='f50dbce30f294bb0ba6bc2811025835d',uuid=5bda996a-1bfe-4f43-aa02-36a864153588,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "address": "fa:16:3e:0c:45:0b", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5287c61f-56", "ovs_interfaceid": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:03:06 user 
nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Converting VIF {"id": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "address": "fa:16:3e:0c:45:0b", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5287c61f-56", "ovs_interfaceid": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:45:0b,bridge_name='br-int',has_traffic_filtering=True,id=5287c61f-56b9-4a9f-87e7-ab7057df84be,network=Network(545a57d8-9d55-4ace-a0ad-635d7bc0ae52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5287c61f-56') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.objects.instance [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lazy-loading 'pci_devices' on Instance uuid 5bda996a-1bfe-4f43-aa02-36a864153588 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] End _get_guest_xml xml=
[libvirt guest domain XML omitted here: the XML markup was stripped when this log was captured, leaving only bare text nodes. The surviving values show uuid 5bda996a-1bfe-4f43-aa02-36a864153588, domain name instance-00000005, memory 131072 KiB, 1 vCPU, nova display name tempest-AttachVolumeShelveTestJSON-server-577930116, creation time 2023-04-20 16:03:06, owner project tempest-AttachVolumeShelveTestJSON-1118127371 and user tempest-AttachVolumeShelveTestJSON-1118127371-project-member, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, os type hvm, CPU model Nehalem, and RNG backend /dev/urandom.]
{{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:02:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-577930116',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-577930116',id=5,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMkHFFsWtozUTkF0VpQ+Cd6z15wOd291X4e8/v6QbZKdTx6+gptvNMQSpe0ybBenimgtpgGav2HnMz19ylSDLLeiOEgxywkrcPA8Jq0CjCrxBO54bQ0ViTd2ITYv71kQ9Q==',key_name='tempest-keypair-1173247378',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cb0a5eb3796a4d3a871843f409c6ffbd',ramdisk_id='',reservation_id='r-mig0m4d8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1118127371',owner_user_name='tempest-AttachVolumeShelveTestJSON-1118127371-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:02:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='f50dbce30f294bb0ba6bc2811025835d',uuid=5bda996a-1bfe-4f43-aa02-36a864153588,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "address": "fa:16:3e:0c:45:0b", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5287c61f-56", "ovs_interfaceid": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:03:06 user 
nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Converting VIF {"id": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "address": "fa:16:3e:0c:45:0b", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5287c61f-56", "ovs_interfaceid": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:45:0b,bridge_name='br-int',has_traffic_filtering=True,id=5287c61f-56b9-4a9f-87e7-ab7057df84be,network=Network(545a57d8-9d55-4ace-a0ad-635d7bc0ae52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5287c61f-56') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG os_vif [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:45:0b,bridge_name='br-int',has_traffic_filtering=True,id=5287c61f-56b9-4a9f-87e7-ab7057df84be,network=Network(545a57d8-9d55-4ace-a0ad-635d7bc0ae52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5287c61f-56') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn 
n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5287c61f-56, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5287c61f-56, col_values=(('external_ids', {'iface-id': '5287c61f-56b9-4a9f-87e7-ab7057df84be', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0c:45:0b', 'vm-uuid': '5bda996a-1bfe-4f43-aa02-36a864153588'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:03:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:06 user nova-compute[71605]: INFO os_vif [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:45:0b,bridge_name='br-int',has_traffic_filtering=True,id=5287c61f-56b9-4a9f-87e7-ab7057df84be,network=Network(545a57d8-9d55-4ace-a0ad-635d7bc0ae52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5287c61f-56') Apr 20 16:03:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:07 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:03:07 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] No VIF found with MAC fa:16:3e:0c:45:0b, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:03:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:07 user nova-compute[71605]: DEBUG nova.compute.manager [req-8b34de21-0a0d-4408-a90b-3e36c080f5f4 req-7caf4b2b-e821-4ba3-978e-e40af379fd8a service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] Received event network-vif-plugged-4bce4922-407c-4e11-b089-154a3299ea1c {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:07 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8b34de21-0a0d-4408-a90b-3e36c080f5f4 req-7caf4b2b-e821-4ba3-978e-e40af379fd8a service nova] Acquiring lock "a5e68386-3b32-458b-9808-797d041c2235-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:07 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8b34de21-0a0d-4408-a90b-3e36c080f5f4 req-7caf4b2b-e821-4ba3-978e-e40af379fd8a service nova] Lock "a5e68386-3b32-458b-9808-797d041c2235-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:07 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8b34de21-0a0d-4408-a90b-3e36c080f5f4 req-7caf4b2b-e821-4ba3-978e-e40af379fd8a service nova] Lock "a5e68386-3b32-458b-9808-797d041c2235-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:07 user nova-compute[71605]: DEBUG nova.compute.manager [req-8b34de21-0a0d-4408-a90b-3e36c080f5f4 req-7caf4b2b-e821-4ba3-978e-e40af379fd8a service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] No waiting events found dispatching network-vif-plugged-4bce4922-407c-4e11-b089-154a3299ea1c {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:03:07 user nova-compute[71605]: WARNING nova.compute.manager [req-8b34de21-0a0d-4408-a90b-3e36c080f5f4 req-7caf4b2b-e821-4ba3-978e-e40af379fd8a service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] Received unexpected event network-vif-plugged-4bce4922-407c-4e11-b089-154a3299ea1c for instance with vm_state building and 
task_state spawning. Apr 20 16:03:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:08 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:08 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:08 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:08 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG nova.network.neutron [req-67aff657-3e49-4b49-81d7-4895f0cd3e60 req-cdf4dfd2-385f-4e3b-9af9-e568f6621873 service nova] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Updated VIF entry in instance network info cache for port 5287c61f-56b9-4a9f-87e7-ab7057df84be. 
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG nova.network.neutron [req-67aff657-3e49-4b49-81d7-4895f0cd3e60 req-cdf4dfd2-385f-4e3b-9af9-e568f6621873 service nova] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Updating instance_info_cache with network_info: [{"id": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "address": "fa:16:3e:0c:45:0b", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5287c61f-56", "ovs_interfaceid": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-67aff657-3e49-4b49-81d7-4895f0cd3e60 req-cdf4dfd2-385f-4e3b-9af9-e568f6621873 service nova] Releasing lock "refresh_cache-5bda996a-1bfe-4f43-aa02-36a864153588" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG nova.compute.manager [req-a609f9dd-7a2d-45f7-b23f-53e6dab49936 req-fff486e5-b941-4fce-9d11-34972838c37b service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Received event network-vif-plugged-b2af67f0-0768-4ebc-a21b-0ef6e2b3f264 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a609f9dd-7a2d-45f7-b23f-53e6dab49936 req-fff486e5-b941-4fce-9d11-34972838c37b service nova] Acquiring lock "91f4b3d1-0fea-4378-94e3-c2bbfd8cad81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a609f9dd-7a2d-45f7-b23f-53e6dab49936 req-fff486e5-b941-4fce-9d11-34972838c37b service nova] Lock "91f4b3d1-0fea-4378-94e3-c2bbfd8cad81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a609f9dd-7a2d-45f7-b23f-53e6dab49936 req-fff486e5-b941-4fce-9d11-34972838c37b service nova] Lock "91f4b3d1-0fea-4378-94e3-c2bbfd8cad81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG nova.compute.manager [req-a609f9dd-7a2d-45f7-b23f-53e6dab49936 req-fff486e5-b941-4fce-9d11-34972838c37b service 
nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] No waiting events found dispatching network-vif-plugged-b2af67f0-0768-4ebc-a21b-0ef6e2b3f264 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:03:09 user nova-compute[71605]: WARNING nova.compute.manager [req-a609f9dd-7a2d-45f7-b23f-53e6dab49936 req-fff486e5-b941-4fce-9d11-34972838c37b service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Received unexpected event network-vif-plugged-b2af67f0-0768-4ebc-a21b-0ef6e2b3f264 for instance with vm_state building and task_state spawning. Apr 20 16:03:09 user nova-compute[71605]: DEBUG nova.compute.manager [req-32e94bfa-e0a0-4d17-93e0-22a4cb52008c req-80c7e9c9-bb7e-4656-a58f-3878958e4823 service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] Received event network-vif-plugged-4bce4922-407c-4e11-b089-154a3299ea1c {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-32e94bfa-e0a0-4d17-93e0-22a4cb52008c req-80c7e9c9-bb7e-4656-a58f-3878958e4823 service nova] Acquiring lock "a5e68386-3b32-458b-9808-797d041c2235-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-32e94bfa-e0a0-4d17-93e0-22a4cb52008c req-80c7e9c9-bb7e-4656-a58f-3878958e4823 service nova] Lock "a5e68386-3b32-458b-9808-797d041c2235-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.007s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-32e94bfa-e0a0-4d17-93e0-22a4cb52008c req-80c7e9c9-bb7e-4656-a58f-3878958e4823 service nova] Lock "a5e68386-3b32-458b-9808-797d041c2235-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG nova.compute.manager [req-32e94bfa-e0a0-4d17-93e0-22a4cb52008c req-80c7e9c9-bb7e-4656-a58f-3878958e4823 service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] No waiting events found dispatching network-vif-plugged-4bce4922-407c-4e11-b089-154a3299ea1c {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:03:09 user nova-compute[71605]: WARNING nova.compute.manager [req-32e94bfa-e0a0-4d17-93e0-22a4cb52008c req-80c7e9c9-bb7e-4656-a58f-3878958e4823 service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] Received unexpected event network-vif-plugged-4bce4922-407c-4e11-b089-154a3299ea1c for instance with vm_state building and task_state spawning. 
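The ovsdbapp transaction logged at 16:03:06 above (AddBridgeCommand on br-int, then AddPortCommand for tap5287c61f-56 plus a DbSetCommand writing the Neutron port ID and MAC into the Interface external_ids) is the port plug that os-vif performs for the VIFOpenVSwitch object. The sketch below is only a rough, illustrative equivalent driven through the ovs-vsctl CLI from Python; Nova/os-vif actually talk to OVSDB through the ovsdbapp native IDL, so treat the function name and the CLI round-trips as assumptions, not the code path that produced these entries.

    # Illustrative approximation of the OVSDB transaction seen in the log,
    # expressed as ovs-vsctl calls. Values are copied from the log entries above.
    import subprocess

    def plug_ovs_port(bridge: str, port: str, iface_id: str, mac: str, vm_uuid: str) -> None:
        """Ensure the integration bridge exists, add the tap port, and tag it
        with the Neutron port ID so the OVN/OVS agent can bind it."""
        subprocess.run(
            ["ovs-vsctl", "--may-exist", "add-br", bridge,
             "--", "set", "Bridge", bridge, "datapath_type=system"],
            check=True,
        )
        subprocess.run(
            ["ovs-vsctl", "--may-exist", "add-port", bridge, port,
             "--", "set", "Interface", port,
             f"external_ids:iface-id={iface_id}",
             "external_ids:iface-status=active",
             f"external_ids:attached-mac={mac}",
             f"external_ids:vm-uuid={vm_uuid}"],
            check=True,
        )

    plug_ovs_port(
        bridge="br-int",
        port="tap5287c61f-56",
        iface_id="5287c61f-56b9-4a9f-87e7-ab7057df84be",
        mac="fa:16:3e:0c:45:0b",
        vm_uuid="5bda996a-1bfe-4f43-aa02-36a864153588",
    )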
Apr 20 16:03:09 user nova-compute[71605]: DEBUG nova.compute.manager [req-32e94bfa-e0a0-4d17-93e0-22a4cb52008c req-80c7e9c9-bb7e-4656-a58f-3878958e4823 service nova] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Received event network-vif-plugged-0b36b1a4-9ab6-49cb-9a5e-afc32792783e {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-32e94bfa-e0a0-4d17-93e0-22a4cb52008c req-80c7e9c9-bb7e-4656-a58f-3878958e4823 service nova] Acquiring lock "d4ea4d29-b178-4da2-b971-76f97031b244-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-32e94bfa-e0a0-4d17-93e0-22a4cb52008c req-80c7e9c9-bb7e-4656-a58f-3878958e4823 service nova] Lock "d4ea4d29-b178-4da2-b971-76f97031b244-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-32e94bfa-e0a0-4d17-93e0-22a4cb52008c req-80c7e9c9-bb7e-4656-a58f-3878958e4823 service nova] Lock "d4ea4d29-b178-4da2-b971-76f97031b244-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG nova.compute.manager [req-32e94bfa-e0a0-4d17-93e0-22a4cb52008c req-80c7e9c9-bb7e-4656-a58f-3878958e4823 service nova] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] No waiting events found dispatching network-vif-plugged-0b36b1a4-9ab6-49cb-9a5e-afc32792783e {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:03:09 user nova-compute[71605]: WARNING nova.compute.manager [req-32e94bfa-e0a0-4d17-93e0-22a4cb52008c req-80c7e9c9-bb7e-4656-a58f-3878958e4823 service nova] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Received unexpected event network-vif-plugged-0b36b1a4-9ab6-49cb-9a5e-afc32792783e for instance with vm_state building and task_state spawning. 
Apr 20 16:03:09 user nova-compute[71605]: DEBUG nova.compute.manager [req-32e94bfa-e0a0-4d17-93e0-22a4cb52008c req-80c7e9c9-bb7e-4656-a58f-3878958e4823 service nova] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Received event network-vif-plugged-0b36b1a4-9ab6-49cb-9a5e-afc32792783e {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-32e94bfa-e0a0-4d17-93e0-22a4cb52008c req-80c7e9c9-bb7e-4656-a58f-3878958e4823 service nova] Acquiring lock "d4ea4d29-b178-4da2-b971-76f97031b244-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-32e94bfa-e0a0-4d17-93e0-22a4cb52008c req-80c7e9c9-bb7e-4656-a58f-3878958e4823 service nova] Lock "d4ea4d29-b178-4da2-b971-76f97031b244-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-32e94bfa-e0a0-4d17-93e0-22a4cb52008c req-80c7e9c9-bb7e-4656-a58f-3878958e4823 service nova] Lock "d4ea4d29-b178-4da2-b971-76f97031b244-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG nova.compute.manager [req-32e94bfa-e0a0-4d17-93e0-22a4cb52008c req-80c7e9c9-bb7e-4656-a58f-3878958e4823 service nova] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] No waiting events found dispatching network-vif-plugged-0b36b1a4-9ab6-49cb-9a5e-afc32792783e {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:03:09 user nova-compute[71605]: WARNING nova.compute.manager [req-32e94bfa-e0a0-4d17-93e0-22a4cb52008c req-80c7e9c9-bb7e-4656-a58f-3878958e4823 service nova] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Received unexpected event network-vif-plugged-0b36b1a4-9ab6-49cb-9a5e-afc32792783e for instance with vm_state building and task_state spawning. 
Apr 20 16:03:09 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:09 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:10 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:10 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:10 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:10 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.compute.manager [req-441a6f58-80b3-4689-9119-a8166d384573 req-594a6f67-ac68-4003-ac2a-c35f1b362728 service nova] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Received event network-vif-plugged-5287c61f-56b9-4a9f-87e7-ab7057df84be {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-441a6f58-80b3-4689-9119-a8166d384573 req-594a6f67-ac68-4003-ac2a-c35f1b362728 service nova] Acquiring lock "5bda996a-1bfe-4f43-aa02-36a864153588-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-441a6f58-80b3-4689-9119-a8166d384573 req-594a6f67-ac68-4003-ac2a-c35f1b362728 service nova] Lock "5bda996a-1bfe-4f43-aa02-36a864153588-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-441a6f58-80b3-4689-9119-a8166d384573 req-594a6f67-ac68-4003-ac2a-c35f1b362728 service nova] Lock "5bda996a-1bfe-4f43-aa02-36a864153588-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.compute.manager [req-441a6f58-80b3-4689-9119-a8166d384573 req-594a6f67-ac68-4003-ac2a-c35f1b362728 service nova] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] No waiting events found dispatching network-vif-plugged-5287c61f-56b9-4a9f-87e7-ab7057df84be {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:03:11 user nova-compute[71605]: WARNING nova.compute.manager [req-441a6f58-80b3-4689-9119-a8166d384573 req-594a6f67-ac68-4003-ac2a-c35f1b362728 service nova] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Received unexpected event network-vif-plugged-5287c61f-56b9-4a9f-87e7-ab7057df84be for instance with vm_state building and task_state spawning. 
Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.compute.manager [req-185976cc-3261-4eb0-a054-3ed8f53b4245 req-b9edb87d-6211-48fc-b5f6-358f561c181e service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Received event network-vif-plugged-b2af67f0-0768-4ebc-a21b-0ef6e2b3f264 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-185976cc-3261-4eb0-a054-3ed8f53b4245 req-b9edb87d-6211-48fc-b5f6-358f561c181e service nova] Acquiring lock "91f4b3d1-0fea-4378-94e3-c2bbfd8cad81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-185976cc-3261-4eb0-a054-3ed8f53b4245 req-b9edb87d-6211-48fc-b5f6-358f561c181e service nova] Lock "91f4b3d1-0fea-4378-94e3-c2bbfd8cad81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-185976cc-3261-4eb0-a054-3ed8f53b4245 req-b9edb87d-6211-48fc-b5f6-358f561c181e service nova] Lock "91f4b3d1-0fea-4378-94e3-c2bbfd8cad81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.compute.manager [req-185976cc-3261-4eb0-a054-3ed8f53b4245 req-b9edb87d-6211-48fc-b5f6-358f561c181e service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] No waiting events found dispatching network-vif-plugged-b2af67f0-0768-4ebc-a21b-0ef6e2b3f264 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:03:11 user nova-compute[71605]: WARNING nova.compute.manager [req-185976cc-3261-4eb0-a054-3ed8f53b4245 req-b9edb87d-6211-48fc-b5f6-358f561c181e service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Received unexpected event network-vif-plugged-b2af67f0-0768-4ebc-a21b-0ef6e2b3f264 for instance with vm_state building and task_state spawning. 
Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.compute.manager [req-185976cc-3261-4eb0-a054-3ed8f53b4245 req-b9edb87d-6211-48fc-b5f6-358f561c181e service nova] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Received event network-vif-plugged-fe98bff4-7b0f-4244-a254-fc9359c00aae {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-185976cc-3261-4eb0-a054-3ed8f53b4245 req-b9edb87d-6211-48fc-b5f6-358f561c181e service nova] Acquiring lock "6d55e5bd-9b03-40a9-bca9-88545039597c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-185976cc-3261-4eb0-a054-3ed8f53b4245 req-b9edb87d-6211-48fc-b5f6-358f561c181e service nova] Lock "6d55e5bd-9b03-40a9-bca9-88545039597c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-185976cc-3261-4eb0-a054-3ed8f53b4245 req-b9edb87d-6211-48fc-b5f6-358f561c181e service nova] Lock "6d55e5bd-9b03-40a9-bca9-88545039597c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.compute.manager [req-185976cc-3261-4eb0-a054-3ed8f53b4245 req-b9edb87d-6211-48fc-b5f6-358f561c181e service nova] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] No waiting events found dispatching network-vif-plugged-fe98bff4-7b0f-4244-a254-fc9359c00aae {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:03:11 user nova-compute[71605]: WARNING nova.compute.manager [req-185976cc-3261-4eb0-a054-3ed8f53b4245 req-b9edb87d-6211-48fc-b5f6-358f561c181e service nova] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Received unexpected event network-vif-plugged-fe98bff4-7b0f-4244-a254-fc9359c00aae for instance with vm_state building and task_state spawning. 
Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.compute.manager [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:03:11 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] VM Resumed (Lifecycle Event) Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.compute.manager [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:03:11 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Instance spawned successfully. 
Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.compute.manager [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:03:11 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Instance spawned successfully. Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:03:11 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: a5e68386-3b32-458b-9808-797d041c2235] Instance spawned successfully. 
Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Found default for hw_disk_bus of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 
tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Found default for hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Found default for hw_disk_bus of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Found default for hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:11 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] During sync_power_state the instance has a pending task (spawning). Skip. 
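The "Attempting to register defaults" / "Found default for hw_... of ..." entries record which bus and device models the driver chose for image properties the image did not define, so the same choices can be reused on later operations against the instance. A rough sketch of that fill-in-the-missing-keys idea; the default values below are simply the ones visible in these log entries, and the function name is illustrative, not Nova's:

def register_undefined_image_defaults(image_props, chosen_defaults):
    # Record only the properties the image left unset, so later rebuilds
    # and rescues keep the same buses/models the guest was booted with.
    registered = {prop: value
                  for prop, value in chosen_defaults.items()
                  if prop not in image_props}
    image_props.update(registered)
    return registered

chosen_defaults = {
    'hw_cdrom_bus': 'ide',
    'hw_disk_bus': 'virtio',
    'hw_input_bus': None,
    'hw_pointer_model': None,
    'hw_video_model': 'virtio',
    'hw_vif_model': 'virtio',
}
# An image that only pinned the disk bus keeps it; everything else gets a default.
print(register_undefined_image_defaults({'hw_disk_bus': 'scsi'}, chosen_defaults))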
Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:03:11 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] VM Started (Lifecycle Event) Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:11 user nova-compute[71605]: INFO nova.compute.manager [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Took 17.40 seconds to spawn the instance on the hypervisor. Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.compute.manager [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.compute.manager [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:03:11 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Instance spawned successfully. 
Apr 20 16:03:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Found default for hw_disk_bus of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Found default for hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details 
/opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Found default for hw_disk_bus of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Found default for hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:12 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:03:12 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a5e68386-3b32-458b-9808-797d041c2235] VM Resumed (Lifecycle Event) Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a5e68386-3b32-458b-9808-797d041c2235] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a5e68386-3b32-458b-9808-797d041c2235] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:03:12 user nova-compute[71605]: INFO nova.compute.manager [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Took 18.26 seconds to spawn the instance on the hypervisor. Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.compute.manager [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:12 user nova-compute[71605]: INFO nova.compute.manager [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Took 18.46 seconds to build instance. Apr 20 16:03:12 user nova-compute[71605]: INFO nova.compute.manager [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Took 18.88 seconds to spawn the instance on the hypervisor. Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.compute.manager [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:12 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a5e68386-3b32-458b-9808-797d041c2235] During sync_power_state the instance has a pending task (spawning). Skip. 
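Each "Resumed"/"Started" lifecycle event triggers a power-state check; because these instances are still building with a spawning task, sync_power_state deliberately leaves them alone ("During sync_power_state the instance has a pending task (spawning). Skip."). A simplified sketch of that decision, not the actual manager code:

RUNNING, NOSTATE = 1, 0  # illustrative power-state constants

def sync_power_state(vm_state, task_state, db_power_state, vm_power_state):
    # Skip while another operation owns the instance; otherwise reconcile
    # the database with what the hypervisor reports.
    if task_state is not None:
        return "skip: pending task (%s)" % task_state
    if db_power_state != vm_power_state:
        return "update DB power_state %s -> %s" % (db_power_state, vm_power_state)
    return "in sync"

print(sync_power_state("building", "spawning", NOSTATE, RUNNING))  # skip
print(sync_power_state("active", None, NOSTATE, RUNNING))          # reconcile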
Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:03:12 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a5e68386-3b32-458b-9808-797d041c2235] VM Started (Lifecycle Event) Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a5e68386-3b32-458b-9808-797d041c2235] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a5e68386-3b32-458b-9808-797d041c2235] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:03:12 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a5e68386-3b32-458b-9808-797d041c2235] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:03:12 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] VM Resumed (Lifecycle Event) Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dc4c5aca-05b6-4b89-8249-daba730d9721 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Lock "91f4b3d1-0fea-4378-94e3-c2bbfd8cad81" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 19.042s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:12 user nova-compute[71605]: INFO nova.compute.manager [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Took 19.95 seconds to build instance. 
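The lockutils entries time both phases of the per-instance build lock: a short "waited 0.001s" to acquire it and a long "held 19.042s" (and similar) spanning the whole _locked_do_build_and_run_instance call. A small context-manager sketch that produces the same kind of wait/hold timing; this is an assumed helper for illustration, not oslo.concurrency itself:

import threading
import time
from contextlib import contextmanager

@contextmanager
def timed_lock(lock, name):
    # Measure time spent waiting for the lock, then time spent holding it.
    t_wait = time.monotonic()
    with lock:
        print('Lock "%s" acquired :: waited %.3fs' % (name, time.monotonic() - t_wait))
        t_hold = time.monotonic()
        try:
            yield
        finally:
            print('Lock "%s" released :: held %.3fs' % (name, time.monotonic() - t_hold))

build_lock = threading.Lock()
with timed_lock(build_lock, "demo-instance-uuid"):
    time.sleep(0.01)  # stand-in for the build-and-run work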
Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:03:12 user nova-compute[71605]: INFO nova.compute.manager [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Took 19.35 seconds to spawn the instance on the hypervisor. Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.compute.manager [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:03:12 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] VM Started (Lifecycle Event) Apr 20 16:03:12 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b5e7ee3c-4e99-4c4a-8ef1-6559580f48e6 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Lock "6d55e5bd-9b03-40a9-bca9-88545039597c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 20.097s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:12 user nova-compute[71605]: INFO nova.compute.manager [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Took 20.07 seconds to build instance. 
Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-013f893b-bbaf-49f0-8539-c25b22e45b60 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "d4ea4d29-b178-4da2-b971-76f97031b244" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 20.225s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:03:12 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] VM Resumed (Lifecycle Event) Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:03:12 user nova-compute[71605]: INFO nova.compute.manager [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Took 20.65 seconds to build instance. 
Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:03:12 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] VM Started (Lifecycle Event) Apr 20 16:03:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5157c134-78bd-4aef-8c9e-48c14bf85791 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "a5e68386-3b32-458b-9808-797d041c2235" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 20.875s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.compute.manager [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:03:12 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Instance spawned successfully. 
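The recurring "ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26" lines are just the OVSDB IDL's poll loop waking up because its connection file descriptor became readable; they carry no per-instance information. A tiny, self-contained illustration of a POLLIN wakeup using select.poll, with a pipe standing in for the OVSDB socket:

import os
import select

r, w = os.pipe()
poller = select.poll()
poller.register(r, select.POLLIN)

os.write(w, b"update")               # simulate the connection becoming readable
for fd, events in poller.poll(100):  # 100 ms timeout
    if events & select.POLLIN:
        print("[POLLIN] on fd %d" % fd)
        os.read(fd, 64)              # drain so the fd does not stay readable
os.close(r)
os.close(w)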
Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:03:12 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] VM Resumed (Lifecycle Event) Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Found default for hw_disk_bus of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 
tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Found default for hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:12 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:03:13 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:03:13 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:03:13 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] VM Started (Lifecycle Event) Apr 20 16:03:13 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:13 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:03:13 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:03:13 user nova-compute[71605]: INFO nova.compute.manager [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Took 15.94 seconds to spawn the instance on the hypervisor. Apr 20 16:03:13 user nova-compute[71605]: DEBUG nova.compute.manager [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:13 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:13 user nova-compute[71605]: INFO nova.compute.manager [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Took 17.20 seconds to build instance. 
Apr 20 16:03:13 user nova-compute[71605]: DEBUG nova.compute.manager [req-f031dbb1-a1b1-4276-beaa-810969b1f75a req-87277225-e967-408e-ad76-a23185866ed2 service nova] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Received event network-vif-plugged-5287c61f-56b9-4a9f-87e7-ab7057df84be {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:13 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-f031dbb1-a1b1-4276-beaa-810969b1f75a req-87277225-e967-408e-ad76-a23185866ed2 service nova] Acquiring lock "5bda996a-1bfe-4f43-aa02-36a864153588-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:13 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-f031dbb1-a1b1-4276-beaa-810969b1f75a req-87277225-e967-408e-ad76-a23185866ed2 service nova] Lock "5bda996a-1bfe-4f43-aa02-36a864153588-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:13 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-f031dbb1-a1b1-4276-beaa-810969b1f75a req-87277225-e967-408e-ad76-a23185866ed2 service nova] Lock "5bda996a-1bfe-4f43-aa02-36a864153588-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:13 user nova-compute[71605]: DEBUG nova.compute.manager [req-f031dbb1-a1b1-4276-beaa-810969b1f75a req-87277225-e967-408e-ad76-a23185866ed2 service nova] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] No waiting events found dispatching network-vif-plugged-5287c61f-56b9-4a9f-87e7-ab7057df84be {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:03:13 user nova-compute[71605]: WARNING nova.compute.manager [req-f031dbb1-a1b1-4276-beaa-810969b1f75a req-87277225-e967-408e-ad76-a23185866ed2 service nova] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Received unexpected event network-vif-plugged-5287c61f-56b9-4a9f-87e7-ab7057df84be for instance with vm_state active and task_state None. 
Apr 20 16:03:13 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e1ce0302-5ad4-4b0c-8429-ba00ad84a16f tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "5bda996a-1bfe-4f43-aa02-36a864153588" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 17.364s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:14 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:14 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:14 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:15 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:15 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:19 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:21 user nova-compute[71605]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:23 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:23 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Acquiring lock "65fc650d-2181-46cb-b91b-4a1104b2afab" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:23 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Lock "65fc650d-2181-46cb-b91b-4a1104b2afab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:23 user nova-compute[71605]: DEBUG nova.compute.manager [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Starting instance... {{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:03:23 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:24 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:03:24 user nova-compute[71605]: INFO nova.compute.claims [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Claim successful on node user Apr 20 16:03:24 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:24 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:03:24 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:03:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.846s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:24 user nova-compute[71605]: DEBUG nova.compute.manager [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Start building networks asynchronously for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:03:24 user nova-compute[71605]: DEBUG nova.compute.manager [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Allocating IP information in the background. 
{{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:03:24 user nova-compute[71605]: DEBUG nova.network.neutron [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:03:24 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Ignoring supplied device name: /dev/sda. Libvirt can't honour user-supplied dev names Apr 20 16:03:25 user nova-compute[71605]: DEBUG nova.compute.manager [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:03:25 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:25 user nova-compute[71605]: DEBUG nova.compute.manager [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Start spawning the instance on the hypervisor. {{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:03:25 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:03:25 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Creating image(s) Apr 20 16:03:25 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Acquiring lock "/opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:25 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Lock "/opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:25 user nova-compute[71605]: DEBUG 
oslo_concurrency.lockutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Lock "/opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:25 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Acquiring lock "ce762eaa4305041e89787e826cfcd91d9c303494" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:25 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Lock "ce762eaa4305041e89787e826cfcd91d9c303494" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:25 user nova-compute[71605]: DEBUG nova.policy [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b21609ce02ce4ed2ba4f8f5d668da192', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '132831801cee4fb185cc27c9792ff5ad', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:03:25 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ce762eaa4305041e89787e826cfcd91d9c303494.part --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:25 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:25 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ce762eaa4305041e89787e826cfcd91d9c303494.part --force-share --output=json" returned: 0 in 0.162s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:25 user nova-compute[71605]: DEBUG nova.virt.images [None 
req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] 4c26d9f3-9ee3-471e-b427-e25c3c09175c was qcow2, converting to raw {{(pid=71605) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 20 16:03:26 user nova-compute[71605]: DEBUG nova.privsep.utils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71605) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 20 16:03:26 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/ce762eaa4305041e89787e826cfcd91d9c303494.part /opt/stack/data/nova/instances/_base/ce762eaa4305041e89787e826cfcd91d9c303494.converted {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:26 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/ce762eaa4305041e89787e826cfcd91d9c303494.part /opt/stack/data/nova/instances/_base/ce762eaa4305041e89787e826cfcd91d9c303494.converted" returned: 0 in 0.354s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:26 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ce762eaa4305041e89787e826cfcd91d9c303494.converted --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:26 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ce762eaa4305041e89787e826cfcd91d9c303494.converted --force-share --output=json" returned: 0 in 0.174s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:26 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Lock "ce762eaa4305041e89787e826cfcd91d9c303494" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.378s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:26 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 
tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ce762eaa4305041e89787e826cfcd91d9c303494 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:26 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ce762eaa4305041e89787e826cfcd91d9c303494 --force-share --output=json" returned: 0 in 0.211s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:26 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Acquiring lock "ce762eaa4305041e89787e826cfcd91d9c303494" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:26 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Lock "ce762eaa4305041e89787e826cfcd91d9c303494" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.003s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:26 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ce762eaa4305041e89787e826cfcd91d9c303494 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:27 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ce762eaa4305041e89787e826cfcd91d9c303494 --force-share --output=json" returned: 0 in 0.190s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:27 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a 
tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/ce762eaa4305041e89787e826cfcd91d9c303494,backing_fmt=raw /opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:27 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/ce762eaa4305041e89787e826cfcd91d9c303494,backing_fmt=raw /opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab/disk 1073741824" returned: 0 in 0.054s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:27 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Lock "ce762eaa4305041e89787e826cfcd91d9c303494" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.253s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:27 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ce762eaa4305041e89787e826cfcd91d9c303494 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:27 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ce762eaa4305041e89787e826cfcd91d9c303494 --force-share --output=json" returned: 0 in 0.191s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:27 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Checking if we can resize image /opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab/disk. 
size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:03:27 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:27 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab/disk --force-share --output=json" returned: 0 in 0.196s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:27 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Cannot resize image /opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab/disk to a smaller size. {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:03:27 user nova-compute[71605]: DEBUG nova.objects.instance [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Lazy-loading 'migration_context' on Instance uuid 65fc650d-2181-46cb-b91b-4a1104b2afab {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:03:27 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:03:27 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Ensure instance console log exists: /opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:03:27 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:27 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:27 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:28 user nova-compute[71605]: DEBUG nova.network.neutron [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Successfully created port: 5c711d7a-9f6d-49dd-af46-c3c1056f702e {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG nova.network.neutron [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Successfully updated port: 5c711d7a-9f6d-49dd-af46-c3c1056f702e {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Acquiring lock "refresh_cache-65fc650d-2181-46cb-b91b-4a1104b2afab" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Acquired lock "refresh_cache-65fc650d-2181-46cb-b91b-4a1104b2afab" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG nova.network.neutron [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG nova.compute.manager [req-b7aa4b11-68a3-46b3-af12-56f02d8b0bf5 req-96c06e07-27e5-4734-9fee-7fe4ec72f249 service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Received event network-changed-5c711d7a-9f6d-49dd-af46-c3c1056f702e {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG nova.compute.manager [req-b7aa4b11-68a3-46b3-af12-56f02d8b0bf5 req-96c06e07-27e5-4734-9fee-7fe4ec72f249 service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Refreshing 
instance network info cache due to event network-changed-5c711d7a-9f6d-49dd-af46-c3c1056f702e. {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b7aa4b11-68a3-46b3-af12-56f02d8b0bf5 req-96c06e07-27e5-4734-9fee-7fe4ec72f249 service nova] Acquiring lock "refresh_cache-65fc650d-2181-46cb-b91b-4a1104b2afab" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG nova.network.neutron [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Instance cache missing network info. {{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG nova.network.neutron [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Updating instance_info_cache with network_info: [{"id": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "address": "fa:16:3e:70:93:1c", "network": {"id": "0bc5d911-da2e-4f4e-9427-2332d7a5bd08", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-494297859-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "132831801cee4fb185cc27c9792ff5ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c711d7a-9f", "ovs_interfaceid": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Releasing lock "refresh_cache-65fc650d-2181-46cb-b91b-4a1104b2afab" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG nova.compute.manager [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Instance network_info: |[{"id": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "address": "fa:16:3e:70:93:1c", "network": {"id": "0bc5d911-da2e-4f4e-9427-2332d7a5bd08", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-494297859-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"132831801cee4fb185cc27c9792ff5ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c711d7a-9f", "ovs_interfaceid": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b7aa4b11-68a3-46b3-af12-56f02d8b0bf5 req-96c06e07-27e5-4734-9fee-7fe4ec72f249 service nova] Acquired lock "refresh_cache-65fc650d-2181-46cb-b91b-4a1104b2afab" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG nova.network.neutron [req-b7aa4b11-68a3-46b3-af12-56f02d8b0bf5 req-96c06e07-27e5-4734-9fee-7fe4ec72f249 service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Refreshing network info cache for port 5c711d7a-9f6d-49dd-af46-c3c1056f702e {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Start _get_guest_xml network_info=[{"id": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "address": "fa:16:3e:70:93:1c", "network": {"id": "0bc5d911-da2e-4f4e-9427-2332d7a5bd08", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-494297859-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "132831801cee4fb185cc27c9792ff5ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c711d7a-9f", "ovs_interfaceid": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'scsi', 'cdrom_bus': 'scsi', 'mapping': {'root': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'scsi', 'dev': 'sdb', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T16:03:12Z,direct_url=,disk_format='qcow2',id=4c26d9f3-9ee3-471e-b427-e25c3c09175c,min_disk=0,min_ram=0,name='',owner='f97b5dc0562846029a0fc40283f5cfad',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T16:03:15Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/sda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/sda', 'disk_bus': 'scsi', 'image_id': 
'4c26d9f3-9ee3-471e-b427-e25c3c09175c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:03:29 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:03:29 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:03:29 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T16:03:12Z,direct_url=,disk_format='qcow2',id=4c26d9f3-9ee3-471e-b427-e25c3c09175c,min_disk=0,min_ram=0,name='',owner='f97b5dc0562846029a0fc40283f5cfad',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T16:03:15Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:03:29 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-20T16:03:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-1701348631',display_name='tempest-AttachSCSIVolumeTestJSON-server-1701348631',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-1701348631',id=6,image_ref='4c26d9f3-9ee3-471e-b427-e25c3c09175c',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD2/JSQECxpyAoht39JYGnv52skhhDoF+ZMQJeVxL2a6UlTIckPD/ph8VMozU2wYXOiMIRgZDapWk23cxn+Rk7SbiF9E3tzwwP5mxsK4xqXHETPbeDxqHRE+MDclya79IQ==',key_name='tempest-keypair-1188279305',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='132831801cee4fb185cc27c9792ff5ad',ramdisk_id='',reservation_id='r-7xqvsqeu',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4c26d9f3-9ee3-471e-b427-e25c3c09175c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-838012861',owner_user_name='tempest-AttachSCSIVolumeTestJSON-838012861-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:03:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b21609ce02ce4ed2ba4f8f5d668da192',uuid=65fc650d-2181-46cb-b91b-4a1104b2afab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "address": "fa:16:3e:70:93:1c", "network": {"id": "0bc5d911-da2e-4f4e-9427-2332d7a5bd08", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-494297859-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "132831801cee4fb185cc27c9792ff5ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c711d7a-9f", "ovs_interfaceid": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Converting VIF {"id": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "address": "fa:16:3e:70:93:1c", "network": {"id": "0bc5d911-da2e-4f4e-9427-2332d7a5bd08", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-494297859-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "132831801cee4fb185cc27c9792ff5ad", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c711d7a-9f", "ovs_interfaceid": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:70:93:1c,bridge_name='br-int',has_traffic_filtering=True,id=5c711d7a-9f6d-49dd-af46-c3c1056f702e,network=Network(0bc5d911-da2e-4f4e-9427-2332d7a5bd08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c711d7a-9f') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG nova.objects.instance [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Lazy-loading 'pci_devices' on Instance uuid 65fc650d-2181-46cb-b91b-4a1104b2afab {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] End _get_guest_xml xml= Apr 20 16:03:30 user nova-compute[71605]: 65fc650d-2181-46cb-b91b-4a1104b2afab Apr 20 16:03:30 user nova-compute[71605]: instance-00000006 Apr 20 16:03:30 user nova-compute[71605]: 131072 Apr 20 16:03:30 user nova-compute[71605]: 1 Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: tempest-AttachSCSIVolumeTestJSON-server-1701348631 Apr 20 16:03:30 user nova-compute[71605]: 2023-04-20 16:03:29 Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: 128 Apr 20 16:03:30 user nova-compute[71605]: 1 Apr 20 16:03:30 user nova-compute[71605]: 0 Apr 20 16:03:30 user nova-compute[71605]: 0 Apr 20 16:03:30 user nova-compute[71605]: 1 Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: tempest-AttachSCSIVolumeTestJSON-838012861-project-member Apr 20 16:03:30 user nova-compute[71605]: tempest-AttachSCSIVolumeTestJSON-838012861 Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: OpenStack Foundation Apr 20 16:03:30 user nova-compute[71605]: OpenStack Nova Apr 20 16:03:30 user nova-compute[71605]: 0.0.0 Apr 20 16:03:30 user nova-compute[71605]: 65fc650d-2181-46cb-b91b-4a1104b2afab Apr 20 16:03:30 user nova-compute[71605]: 
65fc650d-2181-46cb-b91b-4a1104b2afab Apr 20 16:03:30 user nova-compute[71605]: Virtual Machine Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: hvm Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Nehalem Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]:
Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]:
Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: /dev/urandom Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: Apr 20 16:03:30 user nova-compute[71605]: {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-20T16:03:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-1701348631',display_name='tempest-AttachSCSIVolumeTestJSON-server-1701348631',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-1701348631',id=6,image_ref='4c26d9f3-9ee3-471e-b427-e25c3c09175c',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD2/JSQECxpyAoht39JYGnv52skhhDoF+ZMQJeVxL2a6UlTIckPD/ph8VMozU2wYXOiMIRgZDapWk23cxn+Rk7SbiF9E3tzwwP5mxsK4xqXHETPbeDxqHRE+MDclya79IQ==',key_name='tempest-keypair-1188279305',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='132831801cee4fb185cc27c9792ff5ad',ramdisk_id='',reservation_id='r-7xqvsqeu',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4c26d9f3-9ee3-471e-b427-e25c3c09175c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-838012861',owner_user_name='tempest-AttachSCSIVolumeTestJSON-838012861-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:03:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b21609ce02ce4ed2ba4f8f5d668da192',uuid=65fc650d-2181-46cb-b91b-4a1104b2afab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "address": "fa:16:3e:70:93:1c", "network": {"id": "0bc5d911-da2e-4f4e-9427-2332d7a5bd08", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-494297859-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "132831801cee4fb185cc27c9792ff5ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c711d7a-9f", "ovs_interfaceid": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Converting VIF {"id": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "address": "fa:16:3e:70:93:1c", "network": {"id": "0bc5d911-da2e-4f4e-9427-2332d7a5bd08", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-494297859-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "132831801cee4fb185cc27c9792ff5ad", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c711d7a-9f", "ovs_interfaceid": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:70:93:1c,bridge_name='br-int',has_traffic_filtering=True,id=5c711d7a-9f6d-49dd-af46-c3c1056f702e,network=Network(0bc5d911-da2e-4f4e-9427-2332d7a5bd08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c711d7a-9f') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG os_vif [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:70:93:1c,bridge_name='br-int',has_traffic_filtering=True,id=5c711d7a-9f6d-49dd-af46-c3c1056f702e,network=Network(0bc5d911-da2e-4f4e-9427-2332d7a5bd08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c711d7a-9f') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5c711d7a-9f, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5c711d7a-9f, col_values=(('external_ids', {'iface-id': '5c711d7a-9f6d-49dd-af46-c3c1056f702e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:70:93:1c', 'vm-uuid': '65fc650d-2181-46cb-b91b-4a1104b2afab'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:30 user nova-compute[71605]: INFO os_vif [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:70:93:1c,bridge_name='br-int',has_traffic_filtering=True,id=5c711d7a-9f6d-49dd-af46-c3c1056f702e,network=Network(0bc5d911-da2e-4f4e-9427-2332d7a5bd08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c711d7a-9f') Apr 20 16:03:30 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] No BDM found with device name sda, not building metadata. {{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] No BDM found with device name sdb, not building metadata. {{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] No VIF found with MAC fa:16:3e:70:93:1c, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:03:30 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Using config drive Apr 20 16:03:30 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Creating config drive at /opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab/disk.config Apr 20 16:03:30 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V config-2 /tmp/tmp22fd3jfp {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] CMD "genisoimage -o 
/opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V config-2 /tmp/tmp22fd3jfp" returned: 0 in 0.062s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG nova.network.neutron [req-b7aa4b11-68a3-46b3-af12-56f02d8b0bf5 req-96c06e07-27e5-4734-9fee-7fe4ec72f249 service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Updated VIF entry in instance network info cache for port 5c711d7a-9f6d-49dd-af46-c3c1056f702e. {{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG nova.network.neutron [req-b7aa4b11-68a3-46b3-af12-56f02d8b0bf5 req-96c06e07-27e5-4734-9fee-7fe4ec72f249 service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Updating instance_info_cache with network_info: [{"id": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "address": "fa:16:3e:70:93:1c", "network": {"id": "0bc5d911-da2e-4f4e-9427-2332d7a5bd08", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-494297859-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "132831801cee4fb185cc27c9792ff5ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c711d7a-9f", "ovs_interfaceid": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:03:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b7aa4b11-68a3-46b3-af12-56f02d8b0bf5 req-96c06e07-27e5-4734-9fee-7fe4ec72f249 service nova] Releasing lock "refresh_cache-65fc650d-2181-46cb-b91b-4a1104b2afab" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:03:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:32 user nova-compute[71605]: DEBUG nova.compute.manager [req-1b120fbb-549d-4ab6-9ddb-23d9a829b97d req-141c6e0c-c675-4a1e-8ffd-b40d4f8cfd8f service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Received event network-vif-plugged-5c711d7a-9f6d-49dd-af46-c3c1056f702e {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-1b120fbb-549d-4ab6-9ddb-23d9a829b97d req-141c6e0c-c675-4a1e-8ffd-b40d4f8cfd8f service nova] Acquiring lock 
"65fc650d-2181-46cb-b91b-4a1104b2afab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-1b120fbb-549d-4ab6-9ddb-23d9a829b97d req-141c6e0c-c675-4a1e-8ffd-b40d4f8cfd8f service nova] Lock "65fc650d-2181-46cb-b91b-4a1104b2afab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-1b120fbb-549d-4ab6-9ddb-23d9a829b97d req-141c6e0c-c675-4a1e-8ffd-b40d4f8cfd8f service nova] Lock "65fc650d-2181-46cb-b91b-4a1104b2afab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:32 user nova-compute[71605]: DEBUG nova.compute.manager [req-1b120fbb-549d-4ab6-9ddb-23d9a829b97d req-141c6e0c-c675-4a1e-8ffd-b40d4f8cfd8f service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] No waiting events found dispatching network-vif-plugged-5c711d7a-9f6d-49dd-af46-c3c1056f702e {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:03:32 user nova-compute[71605]: WARNING nova.compute.manager [req-1b120fbb-549d-4ab6-9ddb-23d9a829b97d req-141c6e0c-c675-4a1e-8ffd-b40d4f8cfd8f service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Received unexpected event network-vif-plugged-5c711d7a-9f6d-49dd-af46-c3c1056f702e for instance with vm_state building and task_state spawning. 
Apr 20 16:03:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:34 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:03:34 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] VM Resumed (Lifecycle Event) Apr 20 16:03:34 user nova-compute[71605]: DEBUG nova.compute.manager [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:03:34 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:03:34 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Instance spawned successfully. 
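From "Using config drive" through "Instance spawned successfully." the lines above trace a single instance's build on this host. A small helper can be handy for pulling such a per-instance timeline out of the raw journal; the sketch below assumes the log is available as plain text with one journald entry per line (as nova-compute writes it) and uses the instance UUID purely as a filter:

```python
# Hedged helper: extract (timestamp, level, message) for one instance from
# nova-compute journal lines in the layout shown above.
import re

LINE_RE = re.compile(
    r'(?P<ts>\w{3} \d{2} \d{2}:\d{2}:\d{2}) \S+ nova-compute\[\d+\]: '
    r'(?P<level>DEBUG|INFO|WARNING|ERROR) (?P<rest>.*)'
)


def instance_timeline(lines, instance_uuid):
    """Yield (timestamp, level, message) for entries mentioning the instance."""
    for line in lines:
        m = LINE_RE.match(line)
        if m and instance_uuid in m.group('rest'):
            yield m.group('ts'), m.group('level'), m.group('rest')


# Usage sketch (file name is an assumption):
# with open('nova-compute.log') as f:
#     for ts, level, msg in instance_timeline(f, '65fc650d-2181-46cb-b91b-4a1104b2afab'):
#         print(ts, level, msg[:120])
```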
Apr 20 16:03:34 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Attempting to register defaults for the following image properties: ['hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:03:34 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:34 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:34 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:34 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Found default for hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:34 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:34 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:03:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:34 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 20 16:03:34 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:03:34 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] VM Started (Lifecycle Event) Apr 20 16:03:34 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:34 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:03:34 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:03:34 user nova-compute[71605]: INFO nova.compute.manager [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Took 9.11 seconds to spawn the instance on the hypervisor. Apr 20 16:03:34 user nova-compute[71605]: DEBUG nova.compute.manager [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:34 user nova-compute[71605]: INFO nova.compute.manager [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Took 10.47 seconds to build instance. 
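The two INFO lines above give the timing split for this build: the libvirt spawn accounted for 9.11 of the 10.47 seconds reported for the whole build, with the remainder roughly covering the pre-spawn work (resource claim, network and block-device preparation). A quick, hedged way to collect those figures from a longer log, again assuming one journal entry per line:

```python
# Sketch: summarize "Took N seconds to spawn/build" lines so slow builds stand out.
import re
from statistics import mean

TOOK_RE = re.compile(
    r'Took (?P<secs>[\d.]+) seconds to '
    r'(?P<what>spawn the instance on the hypervisor|build instance)'
)


def build_time_summary(lines):
    times = {'spawn the instance on the hypervisor': [], 'build instance': []}
    for line in lines:
        m = TOOK_RE.search(line)
        if m:
            times[m.group('what')].append(float(m.group('secs')))
    # Return (count, average seconds) per phase; 0.0 when no samples were found.
    return {what: (len(vals), mean(vals) if vals else 0.0) for what, vals in times.items()}
```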
Apr 20 16:03:34 user nova-compute[71605]: DEBUG nova.compute.manager [req-a805844b-f721-43d8-8098-b3f88f82d466 req-3208585b-9f20-4508-bbe6-28e0a1b9af7f service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Received event network-vif-plugged-5c711d7a-9f6d-49dd-af46-c3c1056f702e {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a805844b-f721-43d8-8098-b3f88f82d466 req-3208585b-9f20-4508-bbe6-28e0a1b9af7f service nova] Acquiring lock "65fc650d-2181-46cb-b91b-4a1104b2afab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a805844b-f721-43d8-8098-b3f88f82d466 req-3208585b-9f20-4508-bbe6-28e0a1b9af7f service nova] Lock "65fc650d-2181-46cb-b91b-4a1104b2afab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a805844b-f721-43d8-8098-b3f88f82d466 req-3208585b-9f20-4508-bbe6-28e0a1b9af7f service nova] Lock "65fc650d-2181-46cb-b91b-4a1104b2afab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:34 user nova-compute[71605]: DEBUG nova.compute.manager [req-a805844b-f721-43d8-8098-b3f88f82d466 req-3208585b-9f20-4508-bbe6-28e0a1b9af7f service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] No waiting events found dispatching network-vif-plugged-5c711d7a-9f6d-49dd-af46-c3c1056f702e {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:03:34 user nova-compute[71605]: WARNING nova.compute.manager [req-a805844b-f721-43d8-8098-b3f88f82d466 req-3208585b-9f20-4508-bbe6-28e0a1b9af7f service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Received unexpected event network-vif-plugged-5c711d7a-9f6d-49dd-af46-c3c1056f702e for instance with vm_state active and task_state None. 
Apr 20 16:03:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-572d3e04-4cd7-488d-8910-f59e71f4984a tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Lock "65fc650d-2181-46cb-b91b-4a1104b2afab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.582s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:35 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:36 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:03:36 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Cleaning up deleted instances {{(pid=71605) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 20 16:03:36 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] There are 0 instances to clean {{(pid=71605) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 20 16:03:36 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:03:36 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Cleaning up deleted instances with incomplete migration {{(pid=71605) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 20 16:03:36 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:03:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:38 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:03:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:39 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:03:39 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:03:39 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:03:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:39 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:03:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:39 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6d55e5bd-9b03-40a9-bca9-88545039597c/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:39 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6d55e5bd-9b03-40a9-bca9-88545039597c/disk --force-share --output=json" returned: 0 in 0.167s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:39 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None 
req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6d55e5bd-9b03-40a9-bca9-88545039597c/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:39 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6d55e5bd-9b03-40a9-bca9-88545039597c/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:39 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk --force-share --output=json" returned: 0 in 0.155s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:40 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 
None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5bda996a-1bfe-4f43-aa02-36a864153588/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:40 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5bda996a-1bfe-4f43-aa02-36a864153588/disk --force-share --output=json" returned: 0 in 0.330s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5bda996a-1bfe-4f43-aa02-36a864153588/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5bda996a-1bfe-4f43-aa02-36a864153588/disk --force-share --output=json" returned: 0 in 0.161s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): 
/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab/disk --force-share --output=json" returned: 0 in 0.171s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a5e68386-3b32-458b-9808-797d041c2235/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a5e68386-3b32-458b-9808-797d041c2235/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a5e68386-3b32-458b-9808-797d041c2235/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a5e68386-3b32-458b-9808-797d041c2235/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=71605) 
execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:42 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:03:42 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:03:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=8074MB free_disk=26.381519317626953GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", 
"address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:03:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance a5e68386-3b32-458b-9808-797d041c2235 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:03:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 6d55e5bd-9b03-40a9-bca9-88545039597c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:03:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance d4ea4d29-b178-4da2-b971-76f97031b244 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:03:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:03:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 5bda996a-1bfe-4f43-aa02-36a864153588 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:03:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 65fc650d-2181-46cb-b91b-4a1104b2afab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:03:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 6 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:03:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=1280MB phys_disk=40GB used_disk=6GB total_vcpus=12 used_vcpus=6 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:03:42 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:03:42 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:03:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:03:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.480s 
{{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:43 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:03:43 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:03:43 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:03:43 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:03:43 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-a5e68386-3b32-458b-9808-797d041c2235" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:03:43 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-a5e68386-3b32-458b-9808-797d041c2235" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:03:43 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: a5e68386-3b32-458b-9808-797d041c2235] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:03:43 user nova-compute[71605]: DEBUG nova.objects.instance [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lazy-loading 'info_cache' on Instance uuid a5e68386-3b32-458b-9808-797d041c2235 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:03:44 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:44 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:45 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:45 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: a5e68386-3b32-458b-9808-797d041c2235] Updating instance_info_cache with network_info: [{"id": "4bce4922-407c-4e11-b089-154a3299ea1c", "address": "fa:16:3e:bd:61:95", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bce4922-40", "ovs_interfaceid": "4bce4922-407c-4e11-b089-154a3299ea1c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:03:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock "refresh_cache-a5e68386-3b32-458b-9808-797d041c2235" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:03:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: a5e68386-3b32-458b-9808-797d041c2235] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:03:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:03:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:03:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:03:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:03:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Acquiring lock "e1036e0f-683f-4dfd-b0ad-6187d90ff2f6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "e1036e0f-683f-4dfd-b0ad-6187d90ff2f6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Starting instance... {{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:03:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:46 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:03:46 user nova-compute[71605]: INFO nova.compute.claims [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Claim successful on node user Apr 20 16:03:46 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Refreshing inventories for resource provider 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 20 16:03:46 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Updating ProviderTree inventory for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 20 16:03:46 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Updating inventory in ProviderTree for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 20 16:03:46 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Refreshing aggregate associations for resource provider 00e9f769-1a1c-4f1e-80e4-b19657803102, aggregates: None {{(pid=71605) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 20 16:03:46 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Refreshing trait associations for resource provider 00e9f769-1a1c-4f1e-80e4-b19657803102, traits: 
COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_MMX,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI {{(pid=71605) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 20 16:03:46 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:03:46 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:03:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.615s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:46 user nova-compute[71605]: DEBUG nova.compute.manager [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Start building networks asynchronously for instance. 
{{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:03:46 user nova-compute[71605]: DEBUG nova.compute.manager [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Allocating IP information in the background. {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:03:46 user nova-compute[71605]: DEBUG nova.network.neutron [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:03:46 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 16:03:46 user nova-compute[71605]: DEBUG nova.compute.manager [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:03:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquiring lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:46 user nova-compute[71605]: DEBUG nova.compute.manager [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Starting instance... 
{{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG nova.policy [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1c8f57b12bc749888ea89bdbee258811', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '77f831070f5847bda788f6f0fcfedb03', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG nova.compute.manager [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Start spawning the instance on the hypervisor. {{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:03:47 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Creating image(s) Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Acquiring lock "/opt/stack/data/nova/instances/e1036e0f-683f-4dfd-b0ad-6187d90ff2f6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "/opt/stack/data/nova/instances/e1036e0f-683f-4dfd-b0ad-6187d90ff2f6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "/opt/stack/data/nova/instances/e1036e0f-683f-4dfd-b0ad-6187d90ff2f6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 
tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:03:47 user nova-compute[71605]: INFO nova.compute.claims [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Claim successful on node user Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.164s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.003s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils 
[None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.168s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/e1036e0f-683f-4dfd-b0ad-6187d90ff2f6/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/e1036e0f-683f-4dfd-b0ad-6187d90ff2f6/disk 1073741824" returned: 0 in 0.064s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.238s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Inventory has not changed in ProviderTree for provider: 
00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.154s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Checking if we can resize image /opt/stack/data/nova/instances/e1036e0f-683f-4dfd-b0ad-6187d90ff2f6/disk. size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1036e0f-683f-4dfd-b0ad-6187d90ff2f6/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.662s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG nova.compute.manager [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Start building networks asynchronously for instance. 
{{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1036e0f-683f-4dfd-b0ad-6187d90ff2f6/disk --force-share --output=json" returned: 0 in 0.181s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG nova.compute.manager [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Allocating IP information in the background. {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG nova.network.neutron [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Cannot resize image /opt/stack/data/nova/instances/e1036e0f-683f-4dfd-b0ad-6187d90ff2f6/disk to a smaller size. {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG nova.objects.instance [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lazy-loading 'migration_context' on Instance uuid e1036e0f-683f-4dfd-b0ad-6187d90ff2f6 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:03:47 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 20 16:03:47 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Ensure instance console log exists: /opt/stack/data/nova/instances/e1036e0f-683f-4dfd-b0ad-6187d90ff2f6/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:47 user nova-compute[71605]: DEBUG nova.compute.manager [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG nova.compute.manager [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Start spawning the instance on the hypervisor. 
{{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:03:48 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Creating image(s) Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquiring lock "/opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "/opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "/opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.003s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG nova.policy [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e51e637e06d1475692c4055ae99121da', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1d4a73ba128147f295bf6a4545fede47', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 
--force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG nova.network.neutron [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Successfully created port: a0d0df58-0e84-4e27-bc44-3c5983d6d23b {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquiring lock "fe0bde76-a4f8-4865-91af-2bd3790587a7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.158s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG nova.compute.manager [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Starting instance... {{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.004s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.161s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:03:48 user nova-compute[71605]: INFO nova.compute.claims [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Claim successful on node user Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk 1073741824" returned: 0 in 0.059s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.224s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.158s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Checking if we can resize image /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk. 
size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Cannot resize image /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk to a smaller size. {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG nova.objects.instance [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lazy-loading 'migration_context' on Instance uuid e8f62d46-e2dc-4870-adf1-f62d88bb653b {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Ensure instance console log exists: /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 
tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.560s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG nova.compute.manager [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Start building networks asynchronously for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG nova.compute.manager [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Allocating IP information in the background. 
{{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:03:48 user nova-compute[71605]: DEBUG nova.network.neutron [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:03:48 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 16:03:49 user nova-compute[71605]: DEBUG nova.compute.manager [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG nova.compute.manager [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Start spawning the instance on the hypervisor. {{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:03:49 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Creating image(s) Apr 20 16:03:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquiring lock "/opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "/opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 
tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "/opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.007s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG nova.policy [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e51e637e06d1475692c4055ae99121da', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1d4a73ba128147f295bf6a4545fede47', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.160s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 
tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG nova.network.neutron [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Successfully created port: 8200d42f-0f8f-439d-8ea8-1eea4fba54d6 {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.167s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk 1073741824" returned: 0 in 0.069s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.244s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.149s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Checking if we can resize image /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk. size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk --force-share --output=json" returned: 0 in 0.160s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Cannot resize image /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk to a smaller size. 
{{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG nova.objects.instance [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lazy-loading 'migration_context' on Instance uuid fe0bde76-a4f8-4865-91af-2bd3790587a7 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Ensure instance console log exists: /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.network.neutron [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Successfully updated port: a0d0df58-0e84-4e27-bc44-3c5983d6d23b {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Acquiring lock "refresh_cache-e1036e0f-683f-4dfd-b0ad-6187d90ff2f6" {{(pid=71605) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Acquired lock "refresh_cache-e1036e0f-683f-4dfd-b0ad-6187d90ff2f6" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.network.neutron [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.compute.manager [req-cd4311da-9da1-4b48-9f18-64a0b300d99f req-8ad95b84-040b-45a8-8292-cdf05a0684b3 service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Received event network-changed-a0d0df58-0e84-4e27-bc44-3c5983d6d23b {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.compute.manager [req-cd4311da-9da1-4b48-9f18-64a0b300d99f req-8ad95b84-040b-45a8-8292-cdf05a0684b3 service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Refreshing instance network info cache due to event network-changed-a0d0df58-0e84-4e27-bc44-3c5983d6d23b. {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-cd4311da-9da1-4b48-9f18-64a0b300d99f req-8ad95b84-040b-45a8-8292-cdf05a0684b3 service nova] Acquiring lock "refresh_cache-e1036e0f-683f-4dfd-b0ad-6187d90ff2f6" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.network.neutron [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Instance cache missing network info. 
{{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.network.neutron [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Updating instance_info_cache with network_info: [{"id": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "address": "fa:16:3e:96:25:ea", "network": {"id": "27275346-fa92-4114-a62b-d59f0212eb8f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-871140467-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "77f831070f5847bda788f6f0fcfedb03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0d0df58-0e", "ovs_interfaceid": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Releasing lock "refresh_cache-e1036e0f-683f-4dfd-b0ad-6187d90ff2f6" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.compute.manager [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Instance network_info: |[{"id": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "address": "fa:16:3e:96:25:ea", "network": {"id": "27275346-fa92-4114-a62b-d59f0212eb8f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-871140467-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "77f831070f5847bda788f6f0fcfedb03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0d0df58-0e", "ovs_interfaceid": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-cd4311da-9da1-4b48-9f18-64a0b300d99f req-8ad95b84-040b-45a8-8292-cdf05a0684b3 service nova] Acquired lock "refresh_cache-e1036e0f-683f-4dfd-b0ad-6187d90ff2f6" {{(pid=71605) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.network.neutron [req-cd4311da-9da1-4b48-9f18-64a0b300d99f req-8ad95b84-040b-45a8-8292-cdf05a0684b3 service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Refreshing network info cache for port a0d0df58-0e84-4e27-bc44-3c5983d6d23b {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Start _get_guest_xml network_info=[{"id": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "address": "fa:16:3e:96:25:ea", "network": {"id": "27275346-fa92-4114-a62b-d59f0212eb8f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-871140467-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "77f831070f5847bda788f6f0fcfedb03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0d0df58-0e", "ovs_interfaceid": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': '4ac69ea5-e5d7-40c8-864e-0a164d78a727'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:03:50 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:03:50 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
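The prlimit-wrapped qemu-img calls and the can_resize_image check logged near the top of this excerpt can be approximated outside of nova. Below is a minimal standalone sketch, not nova's nova.virt.disk.api code: the command line, the 1 GiB address-space cap (--as=1073741824) and the image paths are copied from the log records above, while the helper names and the grow-only rule are an illustrative simplification.

# Standalone approximation of the checks logged above (not nova's actual
# nova.virt.disk.api.can_resize_image implementation). The command line and
# the --as=1073741824 / --cpu=30 limits are copied verbatim from the log.
import json
import subprocess

def qemu_img_info(path):
    """Run qemu-img info under oslo.concurrency's prlimit wrapper, as the log shows."""
    cmd = [
        "/usr/bin/python3.10", "-m", "oslo_concurrency.prlimit",
        "--as=1073741824", "--cpu=30", "--",
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "info", path, "--force-share", "--output=json",
    ]
    out = subprocess.run(cmd, check=True, capture_output=True, text=True).stdout
    return json.loads(out)

def can_resize_image(path, requested_size):
    """Grow-only rule: refuse to shrink an image below its reported virtual size."""
    virtual_size = qemu_img_info(path)["virtual-size"]
    return virtual_size <= requested_size

if __name__ == "__main__":
    disk = "/opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk"
    if not can_resize_image(disk, 1073741824):
        print("Cannot resize image %s to a smaller size." % disk)

The "Cannot resize image ... to a smaller size." message above corresponds to the case where the requested size is below the image's reported virtual-size.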
Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:03:50 
user nova-compute[71605]: DEBUG nova.virt.hardware [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:03:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-943462612',display_name='tempest-AttachVolumeTestJSON-server-943462612',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-943462612',id=7,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLiDjegrCZQzggT2W88rJM8apM9Us8G90ElMugxXSgu6RWOdd7UNXIA5I2rSuifsaAIZ7hdjna3OuK6N+Oig2F4ghSuSm7pUTAMo6SzF09nfKRInfS2/IkPdA5ci5VCPuw==',key_name='tempest-keypair-2058378220',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='77f831070f5847bda788f6f0fcfedb03',ramdisk_id='',reservation_id='r-genr7j8b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-1838780462',owner_user_name='tempest-AttachVolumeTestJSON-1838780462-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:03:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1c8f57b12bc749888ea89bdbee258811',uuid=e1036e0f-683f-4dfd-b0ad-6187d90ff2f6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "address": "fa:16:3e:96:25:ea", "network": {"id": "27275346-fa92-4114-a62b-d59f0212eb8f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-871140467-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "77f831070f5847bda788f6f0fcfedb03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0d0df58-0e", "ovs_interfaceid": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Converting VIF {"id": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "address": "fa:16:3e:96:25:ea", "network": {"id": "27275346-fa92-4114-a62b-d59f0212eb8f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-871140467-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "77f831070f5847bda788f6f0fcfedb03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0d0df58-0e", "ovs_interfaceid": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:25:ea,bridge_name='br-int',has_traffic_filtering=True,id=a0d0df58-0e84-4e27-bc44-3c5983d6d23b,network=Network(27275346-fa92-4114-a62b-d59f0212eb8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0d0df58-0e') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.objects.instance [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lazy-loading 'pci_devices' on Instance uuid e1036e0f-683f-4dfd-b0ad-6187d90ff2f6 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] End _get_guest_xml xml= Apr 20 16:03:50 user nova-compute[71605]: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6 Apr 20 16:03:50 user nova-compute[71605]: instance-00000007 Apr 20 16:03:50 user nova-compute[71605]: 131072 Apr 20 16:03:50 user nova-compute[71605]: 1 Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: tempest-AttachVolumeTestJSON-server-943462612 Apr 20 16:03:50 user nova-compute[71605]: 2023-04-20 16:03:50 Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: 128 Apr 20 16:03:50 user nova-compute[71605]: 1 Apr 20 16:03:50 user nova-compute[71605]: 0 Apr 20 16:03:50 user nova-compute[71605]: 0 Apr 20 16:03:50 user nova-compute[71605]: 1 Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: tempest-AttachVolumeTestJSON-1838780462-project-member Apr 20 16:03:50 user nova-compute[71605]: tempest-AttachVolumeTestJSON-1838780462 Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: OpenStack Foundation Apr 20 16:03:50 user nova-compute[71605]: OpenStack Nova Apr 20 16:03:50 user nova-compute[71605]: 0.0.0 Apr 20 16:03:50 user nova-compute[71605]: 
e1036e0f-683f-4dfd-b0ad-6187d90ff2f6 Apr 20 16:03:50 user nova-compute[71605]: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6 Apr 20 16:03:50 user nova-compute[71605]: Virtual Machine Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: hvm Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Nehalem Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: /dev/urandom Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: Apr 20 16:03:50 user nova-compute[71605]: {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:03:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-943462612',display_name='tempest-AttachVolumeTestJSON-server-943462612',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-943462612',id=7,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLiDjegrCZQzggT2W88rJM8apM9Us8G90ElMugxXSgu6RWOdd7UNXIA5I2rSuifsaAIZ7hdjna3OuK6N+Oig2F4ghSuSm7pUTAMo6SzF09nfKRInfS2/IkPdA5ci5VCPuw==',key_name='tempest-keypair-2058378220',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='77f831070f5847bda788f6f0fcfedb03',ramdisk_id='',reservation_id='r-genr7j8b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-1838780462',owner_user_name='tempest-AttachVolumeTestJSON-1838780462-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:03:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1c8f57b12bc749888ea89bdbee258811',uuid=e1036e0f-683f-4dfd-b0ad-6187d90ff2f6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "address": "fa:16:3e:96:25:ea", "network": {"id": "27275346-fa92-4114-a62b-d59f0212eb8f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-871140467-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "77f831070f5847bda788f6f0fcfedb03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0d0df58-0e", "ovs_interfaceid": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Converting VIF {"id": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "address": "fa:16:3e:96:25:ea", "network": {"id": "27275346-fa92-4114-a62b-d59f0212eb8f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-871140467-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"77f831070f5847bda788f6f0fcfedb03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0d0df58-0e", "ovs_interfaceid": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:25:ea,bridge_name='br-int',has_traffic_filtering=True,id=a0d0df58-0e84-4e27-bc44-3c5983d6d23b,network=Network(27275346-fa92-4114-a62b-d59f0212eb8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0d0df58-0e') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG os_vif [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:25:ea,bridge_name='br-int',has_traffic_filtering=True,id=a0d0df58-0e84-4e27-bc44-3c5983d6d23b,network=Network(27275346-fa92-4114-a62b-d59f0212eb8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0d0df58-0e') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa0d0df58-0e, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa0d0df58-0e, col_values=(('external_ids', {'iface-id': 'a0d0df58-0e84-4e27-bc44-3c5983d6d23b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:96:25:ea', 'vm-uuid': 'e1036e0f-683f-4dfd-b0ad-6187d90ff2f6'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on 
fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:50 user nova-compute[71605]: INFO os_vif [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:25:ea,bridge_name='br-int',has_traffic_filtering=True,id=a0d0df58-0e84-4e27-bc44-3c5983d6d23b,network=Network(27275346-fa92-4114-a62b-d59f0212eb8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0d0df58-0e') Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] No BDM found with device name vda, not building metadata. {{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:03:50 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] No VIF found with MAC fa:16:3e:96:25:ea, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:03:51 user nova-compute[71605]: DEBUG nova.network.neutron [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Successfully created port: 9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:03:52 user nova-compute[71605]: DEBUG nova.network.neutron [req-cd4311da-9da1-4b48-9f18-64a0b300d99f req-8ad95b84-040b-45a8-8292-cdf05a0684b3 service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Updated VIF entry in instance network info cache for port a0d0df58-0e84-4e27-bc44-3c5983d6d23b. 
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:03:52 user nova-compute[71605]: DEBUG nova.network.neutron [req-cd4311da-9da1-4b48-9f18-64a0b300d99f req-8ad95b84-040b-45a8-8292-cdf05a0684b3 service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Updating instance_info_cache with network_info: [{"id": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "address": "fa:16:3e:96:25:ea", "network": {"id": "27275346-fa92-4114-a62b-d59f0212eb8f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-871140467-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "77f831070f5847bda788f6f0fcfedb03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0d0df58-0e", "ovs_interfaceid": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:03:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-cd4311da-9da1-4b48-9f18-64a0b300d99f req-8ad95b84-040b-45a8-8292-cdf05a0684b3 service nova] Releasing lock "refresh_cache-e1036e0f-683f-4dfd-b0ad-6187d90ff2f6" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:03:52 user nova-compute[71605]: DEBUG nova.network.neutron [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Successfully updated port: 8200d42f-0f8f-439d-8ea8-1eea4fba54d6 {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:03:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquiring lock "refresh_cache-e8f62d46-e2dc-4870-adf1-f62d88bb653b" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:03:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquired lock "refresh_cache-e8f62d46-e2dc-4870-adf1-f62d88bb653b" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:03:52 user nova-compute[71605]: DEBUG nova.network.neutron [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:03:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:52 user nova-compute[71605]: DEBUG nova.compute.manager [req-fc9ba2a1-6ae8-468e-9718-2475df971122 req-3914632d-dad7-4c2b-ade3-7eb5a7ea506c service nova] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Received event network-changed-8200d42f-0f8f-439d-8ea8-1eea4fba54d6 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:52 user nova-compute[71605]: DEBUG nova.compute.manager [req-fc9ba2a1-6ae8-468e-9718-2475df971122 req-3914632d-dad7-4c2b-ade3-7eb5a7ea506c service nova] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Refreshing instance network info cache due to event network-changed-8200d42f-0f8f-439d-8ea8-1eea4fba54d6. {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:03:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-fc9ba2a1-6ae8-468e-9718-2475df971122 req-3914632d-dad7-4c2b-ade3-7eb5a7ea506c service nova] Acquiring lock "refresh_cache-e8f62d46-e2dc-4870-adf1-f62d88bb653b" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:03:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:52 user nova-compute[71605]: DEBUG nova.network.neutron [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Instance cache missing network info. {{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:03:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.network.neutron [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Updating instance_info_cache with network_info: [{"id": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "address": "fa:16:3e:6d:26:c0", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8200d42f-0f", "ovs_interfaceid": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} 
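The ovsdbapp transactions a few records above (AddBridgeCommand, then AddPortCommand plus a DbSetCommand on the Interface row) ensure the integration bridge exists, attach the tap device, and tag it with the Neutron port ID, MAC and instance UUID so OVN can bind the port. A rough equivalent can be replayed by hand with ovs-vsctl; the sketch below is illustrative only (os-vif actually talks to OVSDB through ovsdbapp, not the CLI), and every name and value in it is taken from the logged commands.

# Illustrative replay of the logged OVSDB changes using ovs-vsctl instead of
# ovsdbapp. Port name, MAC, port ID and instance UUID come from the
# AddBridgeCommand/AddPortCommand/DbSetCommand entries above.
import subprocess

def vsctl(*args):
    # Each call is one ovs-vsctl invocation; check=True raises on failure.
    subprocess.run(["ovs-vsctl", *args], check=True)

# AddBridgeCommand(name=br-int, may_exist=True, datapath_type=system)
vsctl("--may-exist", "add-br", "br-int")
vsctl("set", "Bridge", "br-int", "datapath_type=system")

# AddPortCommand(bridge=br-int, port=tapa0d0df58-0e, may_exist=True)
vsctl("--may-exist", "add-port", "br-int", "tapa0d0df58-0e")

# DbSetCommand(table=Interface, record=tapa0d0df58-0e, external_ids={...})
vsctl("set", "Interface", "tapa0d0df58-0e",
      "external_ids:iface-id=a0d0df58-0e84-4e27-bc44-3c5983d6d23b",
      "external_ids:iface-status=active",
      'external_ids:attached-mac="fa:16:3e:96:25:ea"',
      "external_ids:vm-uuid=e1036e0f-683f-4dfd-b0ad-6187d90ff2f6")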
Apr 20 16:03:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Releasing lock "refresh_cache-e8f62d46-e2dc-4870-adf1-f62d88bb653b" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.compute.manager [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Instance network_info: |[{"id": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "address": "fa:16:3e:6d:26:c0", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8200d42f-0f", "ovs_interfaceid": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-fc9ba2a1-6ae8-468e-9718-2475df971122 req-3914632d-dad7-4c2b-ade3-7eb5a7ea506c service nova] Acquired lock "refresh_cache-e8f62d46-e2dc-4870-adf1-f62d88bb653b" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.network.neutron [req-fc9ba2a1-6ae8-468e-9718-2475df971122 req-3914632d-dad7-4c2b-ade3-7eb5a7ea506c service nova] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Refreshing network info cache for port 8200d42f-0f8f-439d-8ea8-1eea4fba54d6 {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Start _get_guest_xml network_info=[{"id": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "address": "fa:16:3e:6d:26:c0", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": 
"ovn"}}, "devname": "tap8200d42f-0f", "ovs_interfaceid": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': '4ac69ea5-e5d7-40c8-864e-0a164d78a727'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:03:53 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:03:53 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:03:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-2146872314',display_name='tempest-ServerRescueNegativeTestJSON-server-2146872314',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-2146872314',id=8,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1d4a73ba128147f295bf6a4545fede47',ramdisk_id='',reservation_id='r-oxzlvpr2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-237285916',owner_user_name='tempest-ServerRescueNegativeTestJSON-237285916-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:03:48Z,user_data=None,user_id='e51e637e06d1475692c4055ae99121da',uuid=e8f62d46-e2dc-4870-adf1-f62d88bb653b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "address": "fa:16:3e:6d:26:c0", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8200d42f-0f", "ovs_interfaceid": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Converting VIF {"id": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "address": 
"fa:16:3e:6d:26:c0", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8200d42f-0f", "ovs_interfaceid": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:26:c0,bridge_name='br-int',has_traffic_filtering=True,id=8200d42f-0f8f-439d-8ea8-1eea4fba54d6,network=Network(4b5db782-8dbb-4f06-8e98-a794013dbc8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8200d42f-0f') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.objects.instance [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lazy-loading 'pci_devices' on Instance uuid e8f62d46-e2dc-4870-adf1-f62d88bb653b {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] End _get_guest_xml xml= Apr 20 16:03:53 user nova-compute[71605]: e8f62d46-e2dc-4870-adf1-f62d88bb653b Apr 20 16:03:53 user nova-compute[71605]: instance-00000008 Apr 20 16:03:53 user nova-compute[71605]: 131072 Apr 20 16:03:53 user nova-compute[71605]: 1 Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: tempest-ServerRescueNegativeTestJSON-server-2146872314 Apr 20 16:03:53 user nova-compute[71605]: 2023-04-20 16:03:53 Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: 128 Apr 20 16:03:53 user nova-compute[71605]: 1 Apr 20 16:03:53 user nova-compute[71605]: 0 Apr 20 16:03:53 user nova-compute[71605]: 0 Apr 20 16:03:53 user nova-compute[71605]: 1 Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: tempest-ServerRescueNegativeTestJSON-237285916-project-member Apr 20 16:03:53 user nova-compute[71605]: tempest-ServerRescueNegativeTestJSON-237285916 Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user 
nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: OpenStack Foundation Apr 20 16:03:53 user nova-compute[71605]: OpenStack Nova Apr 20 16:03:53 user nova-compute[71605]: 0.0.0 Apr 20 16:03:53 user nova-compute[71605]: e8f62d46-e2dc-4870-adf1-f62d88bb653b Apr 20 16:03:53 user nova-compute[71605]: e8f62d46-e2dc-4870-adf1-f62d88bb653b Apr 20 16:03:53 user nova-compute[71605]: Virtual Machine Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: hvm Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Nehalem Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: /dev/urandom Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: Apr 20 16:03:53 user nova-compute[71605]: {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:03:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-2146872314',display_name='tempest-ServerRescueNegativeTestJSON-server-2146872314',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-2146872314',id=8,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1d4a73ba128147f295bf6a4545fede47',ramdisk_id='',reservation_id='r-oxzlvpr2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-237285916',owner_user_name='tempest-ServerRescueNegativeTestJSON-237285916-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:03:48Z,user_data=None,user_id='e51e637e06d1475692c4055ae99121da',uuid=e8f62d46-e2dc-4870-adf1-f62d88bb653b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "address": "fa:16:3e:6d:26:c0", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8200d42f-0f", "ovs_interfaceid": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Converting VIF {"id": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "address": 
"fa:16:3e:6d:26:c0", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8200d42f-0f", "ovs_interfaceid": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:26:c0,bridge_name='br-int',has_traffic_filtering=True,id=8200d42f-0f8f-439d-8ea8-1eea4fba54d6,network=Network(4b5db782-8dbb-4f06-8e98-a794013dbc8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8200d42f-0f') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG os_vif [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:26:c0,bridge_name='br-int',has_traffic_filtering=True,id=8200d42f-0f8f-439d-8ea8-1eea4fba54d6,network=Network(4b5db782-8dbb-4f06-8e98-a794013dbc8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8200d42f-0f') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8200d42f-0f, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8200d42f-0f, col_values=(('external_ids', {'iface-id': '8200d42f-0f8f-439d-8ea8-1eea4fba54d6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6d:26:c0', 'vm-uuid': 'e8f62d46-e2dc-4870-adf1-f62d88bb653b'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:53 user nova-compute[71605]: INFO os_vif [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:26:c0,bridge_name='br-int',has_traffic_filtering=True,id=8200d42f-0f8f-439d-8ea8-1eea4fba54d6,network=Network(4b5db782-8dbb-4f06-8e98-a794013dbc8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8200d42f-0f') Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] No VIF found with MAC fa:16:3e:6d:26:c0, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.network.neutron [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Successfully updated port: 9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquiring lock "refresh_cache-fe0bde76-a4f8-4865-91af-2bd3790587a7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquired lock "refresh_cache-fe0bde76-a4f8-4865-91af-2bd3790587a7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.network.neutron [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:03:53 user nova-compute[71605]: DEBUG nova.network.neutron [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Instance cache missing network info. {{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.network.neutron [req-fc9ba2a1-6ae8-468e-9718-2475df971122 req-3914632d-dad7-4c2b-ade3-7eb5a7ea506c service nova] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Updated VIF entry in instance network info cache for port 8200d42f-0f8f-439d-8ea8-1eea4fba54d6. 
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.network.neutron [req-fc9ba2a1-6ae8-468e-9718-2475df971122 req-3914632d-dad7-4c2b-ade3-7eb5a7ea506c service nova] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Updating instance_info_cache with network_info: [{"id": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "address": "fa:16:3e:6d:26:c0", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8200d42f-0f", "ovs_interfaceid": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.compute.manager [req-f68a2da9-a93e-4f81-b05a-aecd9e589bdf req-e7717916-a61f-4669-958a-91e45e9e4c26 service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Received event network-vif-plugged-a0d0df58-0e84-4e27-bc44-3c5983d6d23b {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-f68a2da9-a93e-4f81-b05a-aecd9e589bdf req-e7717916-a61f-4669-958a-91e45e9e4c26 service nova] Acquiring lock "e1036e0f-683f-4dfd-b0ad-6187d90ff2f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-f68a2da9-a93e-4f81-b05a-aecd9e589bdf req-e7717916-a61f-4669-958a-91e45e9e4c26 service nova] Lock "e1036e0f-683f-4dfd-b0ad-6187d90ff2f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-f68a2da9-a93e-4f81-b05a-aecd9e589bdf req-e7717916-a61f-4669-958a-91e45e9e4c26 service nova] Lock "e1036e0f-683f-4dfd-b0ad-6187d90ff2f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.compute.manager [req-f68a2da9-a93e-4f81-b05a-aecd9e589bdf req-e7717916-a61f-4669-958a-91e45e9e4c26 service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] No waiting events found dispatching network-vif-plugged-a0d0df58-0e84-4e27-bc44-3c5983d6d23b {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:03:54 user nova-compute[71605]: WARNING nova.compute.manager 
[req-f68a2da9-a93e-4f81-b05a-aecd9e589bdf req-e7717916-a61f-4669-958a-91e45e9e4c26 service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Received unexpected event network-vif-plugged-a0d0df58-0e84-4e27-bc44-3c5983d6d23b for instance with vm_state building and task_state spawning. Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.compute.manager [req-f68a2da9-a93e-4f81-b05a-aecd9e589bdf req-e7717916-a61f-4669-958a-91e45e9e4c26 service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Received event network-vif-plugged-a0d0df58-0e84-4e27-bc44-3c5983d6d23b {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-f68a2da9-a93e-4f81-b05a-aecd9e589bdf req-e7717916-a61f-4669-958a-91e45e9e4c26 service nova] Acquiring lock "e1036e0f-683f-4dfd-b0ad-6187d90ff2f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-f68a2da9-a93e-4f81-b05a-aecd9e589bdf req-e7717916-a61f-4669-958a-91e45e9e4c26 service nova] Lock "e1036e0f-683f-4dfd-b0ad-6187d90ff2f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-f68a2da9-a93e-4f81-b05a-aecd9e589bdf req-e7717916-a61f-4669-958a-91e45e9e4c26 service nova] Lock "e1036e0f-683f-4dfd-b0ad-6187d90ff2f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.compute.manager [req-f68a2da9-a93e-4f81-b05a-aecd9e589bdf req-e7717916-a61f-4669-958a-91e45e9e4c26 service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] No waiting events found dispatching network-vif-plugged-a0d0df58-0e84-4e27-bc44-3c5983d6d23b {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:03:54 user nova-compute[71605]: WARNING nova.compute.manager [req-f68a2da9-a93e-4f81-b05a-aecd9e589bdf req-e7717916-a61f-4669-958a-91e45e9e4c26 service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Received unexpected event network-vif-plugged-a0d0df58-0e84-4e27-bc44-3c5983d6d23b for instance with vm_state building and task_state spawning. Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.compute.manager [req-f68a2da9-a93e-4f81-b05a-aecd9e589bdf req-e7717916-a61f-4669-958a-91e45e9e4c26 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received event network-changed-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.compute.manager [req-f68a2da9-a93e-4f81-b05a-aecd9e589bdf req-e7717916-a61f-4669-958a-91e45e9e4c26 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Refreshing instance network info cache due to event network-changed-9f4d2191-16c0-4ab6-a4bd-f016499a9aad. 
{{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-f68a2da9-a93e-4f81-b05a-aecd9e589bdf req-e7717916-a61f-4669-958a-91e45e9e4c26 service nova] Acquiring lock "refresh_cache-fe0bde76-a4f8-4865-91af-2bd3790587a7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-fc9ba2a1-6ae8-468e-9718-2475df971122 req-3914632d-dad7-4c2b-ade3-7eb5a7ea506c service nova] Releasing lock "refresh_cache-e8f62d46-e2dc-4870-adf1-f62d88bb653b" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.network.neutron [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Updating instance_info_cache with network_info: [{"id": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "address": "fa:16:3e:dd:52:dd", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f4d2191-16", "ovs_interfaceid": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Releasing lock "refresh_cache-fe0bde76-a4f8-4865-91af-2bd3790587a7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Instance network_info: |[{"id": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "address": "fa:16:3e:dd:52:dd", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", 
"details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f4d2191-16", "ovs_interfaceid": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-f68a2da9-a93e-4f81-b05a-aecd9e589bdf req-e7717916-a61f-4669-958a-91e45e9e4c26 service nova] Acquired lock "refresh_cache-fe0bde76-a4f8-4865-91af-2bd3790587a7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.network.neutron [req-f68a2da9-a93e-4f81-b05a-aecd9e589bdf req-e7717916-a61f-4669-958a-91e45e9e4c26 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Refreshing network info cache for port 9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Start _get_guest_xml network_info=[{"id": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "address": "fa:16:3e:dd:52:dd", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f4d2191-16", "ovs_interfaceid": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': '4ac69ea5-e5d7-40c8-864e-0a164d78a727'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml 
/opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Acquiring lock "dc918ed4-8bc6-4a4f-a189-d6cdd5817854" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "dc918ed4-8bc6-4a4f-a189-d6cdd5817854" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:54 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:03:54 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:03:54 user 
nova-compute[71605]: DEBUG nova.virt.hardware [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:03:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1422609663',display_name='tempest-ServerRescueNegativeTestJSON-server-1422609663',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1422609663',id=9,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1d4a73ba128147f295bf6a4545fede47',ramdisk_id='',reservation_id='r-0526y6jk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-237285916',owner_user_name='tempest-ServerRescueNegativeTestJSON-237285916-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:03:49Z,user_data=None,user_id='e51e637e06d1475692c4055ae99121da',uuid=fe0bde76-a4f8-4865-91af-2bd3790587a7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "address": "fa:16:3e:dd:52:dd", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f4d2191-16", "ovs_interfaceid": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Converting VIF {"id": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "address": 
"fa:16:3e:dd:52:dd", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f4d2191-16", "ovs_interfaceid": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:52:dd,bridge_name='br-int',has_traffic_filtering=True,id=9f4d2191-16c0-4ab6-a4bd-f016499a9aad,network=Network(4b5db782-8dbb-4f06-8e98-a794013dbc8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f4d2191-16') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.objects.instance [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lazy-loading 'pci_devices' on Instance uuid fe0bde76-a4f8-4865-91af-2bd3790587a7 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Starting instance... 
{{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] End _get_guest_xml xml= [libvirt guest XML elided: the XML markup was stripped when this log was captured, leaving only repeated syslog prefixes; the surviving element text, in order, was: fe0bde76-a4f8-4865-91af-2bd3790587a7, instance-00000009, 131072, 1, tempest-ServerRescueNegativeTestJSON-server-1422609663, 2023-04-20 16:03:54, 128, 1, 0, 0, 1, tempest-ServerRescueNegativeTestJSON-237285916-project-member, tempest-ServerRescueNegativeTestJSON-237285916, OpenStack Foundation, OpenStack Nova, 0.0.0, fe0bde76-a4f8-4865-91af-2bd3790587a7 (twice), Virtual Machine, hvm, Nehalem, /dev/urandom] {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:03:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1422609663',display_name='tempest-ServerRescueNegativeTestJSON-server-1422609663',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1422609663',id=9,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1d4a73ba128147f295bf6a4545fede47',ramdisk_id='',reservation_id='r-0526y6jk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-237285916',owner_user_name='tempest-ServerRescueNegativeTestJSON-237285916-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:03:49Z,user_data=None,user_id='e51e637e06d1475692c4055ae99121da',uuid=fe0bde76-a4f8-4865-91af-2bd3790587a7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "address": "fa:16:3e:dd:52:dd", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": 
{"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f4d2191-16", "ovs_interfaceid": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Converting VIF {"id": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "address": "fa:16:3e:dd:52:dd", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f4d2191-16", "ovs_interfaceid": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:52:dd,bridge_name='br-int',has_traffic_filtering=True,id=9f4d2191-16c0-4ab6-a4bd-f016499a9aad,network=Network(4b5db782-8dbb-4f06-8e98-a794013dbc8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f4d2191-16') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG os_vif [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:52:dd,bridge_name='br-int',has_traffic_filtering=True,id=9f4d2191-16c0-4ab6-a4bd-f016499a9aad,network=Network(4b5db782-8dbb-4f06-8e98-a794013dbc8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f4d2191-16') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 
16:03:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9f4d2191-16, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9f4d2191-16, col_values=(('external_ids', {'iface-id': '9f4d2191-16c0-4ab6-a4bd-f016499a9aad', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:52:dd', 'vm-uuid': 'fe0bde76-a4f8-4865-91af-2bd3790587a7'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:03:54 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] VM Resumed (Lifecycle Event) Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:03:54 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Instance spawned successfully. 
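[Editor's note] The ovsdbapp records above (AddBridgeCommand, AddPortCommand, DbSetCommand against the Interface table) are the OVSDB transaction that os-vif's ovs plugin runs to wire the tap device into br-int. Below is only a rough, hand-written sketch of an equivalent transaction driven directly through ovsdbapp; the OVSDB socket path and the 10-second timeout are assumptions for illustration rather than values taken from this deployment, and the real vif_plug_ovs code sets additional columns and handles errors.

# Illustrative sketch only: plug a tap device into br-int with ovsdbapp,
# mirroring the AddBridgeCommand/AddPortCommand/DbSetCommand lines logged above.
# The connection string and timeout are assumptions, not deployment values.
from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

OVSDB = 'unix:/usr/local/var/run/openvswitch/db.sock'  # assumed socket path
idl = connection.OvsdbIdl.from_server(OVSDB, 'Open_vSwitch')
api = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

with api.transaction(check_error=True) as txn:
    # AddBridgeCommand(name=br-int, may_exist=True, datapath_type=system)
    txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
    # AddPortCommand(bridge=br-int, port=tap9f4d2191-16, may_exist=True)
    txn.add(api.add_port('br-int', 'tap9f4d2191-16', may_exist=True))
    # DbSetCommand(table=Interface, record=tap9f4d2191-16, external_ids={...})
    txn.add(api.db_set(
        'Interface', 'tap9f4d2191-16',
        ('external_ids', {'iface-id': '9f4d2191-16c0-4ab6-a4bd-f016499a9aad',
                          'iface-status': 'active',
                          'attached-mac': 'fa:16:3e:dd:52:dd',
                          'vm-uuid': 'fe0bde76-a4f8-4865-91af-2bd3790587a7'})))

The "Transaction caused no change" line logged after AddBridgeCommand simply means br-int already existed, so the may_exist=True bridge creation was a no-op.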
Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:54 user nova-compute[71605]: INFO os_vif [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:52:dd,bridge_name='br-int',has_traffic_filtering=True,id=9f4d2191-16c0-4ab6-a4bd-f016499a9aad,network=Network(4b5db782-8dbb-4f06-8e98-a794013dbc8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f4d2191-16') Apr 20 16:03:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.004s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details 
/opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Found default for hw_disk_bus of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Found default for hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:54 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:03:54 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] VM Started (Lifecycle Event) Apr 20 16:03:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:03:54 user nova-compute[71605]: INFO nova.compute.claims [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Claim successful on node user Apr 20 16:03:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] No BDM found with device name vda, not building metadata. {{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] No VIF found with MAC fa:16:3e:dd:52:dd, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:03:54 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:03:54 user nova-compute[71605]: INFO nova.compute.manager [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Took 7.90 seconds to spawn the instance on the hypervisor. 
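
The "Synchronizing instance power state" entries above compare the database power_state (0, i.e. NOSTATE) with what the hypervisor reports (1, i.e. RUNNING) and then skip the sync because the instance still has a pending task. A conceptual sketch of that decision, not nova's actual implementation:

    # Conceptual sketch of the skip logged above (not nova's code): the sync is
    # deferred while a task such as 'spawning' is still in flight.
    NOSTATE, RUNNING = 0, 1

    def sync_power_state(db_state, vm_state, task_state):
        if task_state is not None:
            return "skip: instance has a pending task (%s)" % task_state
        if db_state != vm_state:
            return "update DB power_state to %d" % vm_state
        return "already in sync"

    print(sync_power_state(NOSTATE, RUNNING, "spawning"))  # -> skip: pending task
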
Apr 20 16:03:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG nova.compute.manager [req-9e14682d-3a59-4cd8-9125-d1fcd74bdf7a req-3b4f6cc0-7bc8-4903-a2b8-bcfe19d66b32 service nova] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Received event network-vif-plugged-8200d42f-0f8f-439d-8ea8-1eea4fba54d6 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-9e14682d-3a59-4cd8-9125-d1fcd74bdf7a req-3b4f6cc0-7bc8-4903-a2b8-bcfe19d66b32 service nova] Acquiring lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-9e14682d-3a59-4cd8-9125-d1fcd74bdf7a req-3b4f6cc0-7bc8-4903-a2b8-bcfe19d66b32 service nova] Lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-9e14682d-3a59-4cd8-9125-d1fcd74bdf7a req-3b4f6cc0-7bc8-4903-a2b8-bcfe19d66b32 service nova] Lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG nova.compute.manager [req-9e14682d-3a59-4cd8-9125-d1fcd74bdf7a req-3b4f6cc0-7bc8-4903-a2b8-bcfe19d66b32 service nova] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] No waiting events found dispatching network-vif-plugged-8200d42f-0f8f-439d-8ea8-1eea4fba54d6 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:03:55 user nova-compute[71605]: WARNING nova.compute.manager [req-9e14682d-3a59-4cd8-9125-d1fcd74bdf7a req-3b4f6cc0-7bc8-4903-a2b8-bcfe19d66b32 service nova] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Received unexpected event network-vif-plugged-8200d42f-0f8f-439d-8ea8-1eea4fba54d6 for instance with vm_state building and task_state spawning. Apr 20 16:03:55 user nova-compute[71605]: INFO nova.compute.manager [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Took 9.19 seconds to build instance. 
Apr 20 16:03:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-46b31143-3bfb-48f5-97d6-d7cd760e13f0 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "e1036e0f-683f-4dfd-b0ad-6187d90ff2f6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.321s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG nova.network.neutron [req-f68a2da9-a93e-4f81-b05a-aecd9e589bdf req-e7717916-a61f-4669-958a-91e45e9e4c26 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Updated VIF entry in instance network info cache for port 9f4d2191-16c0-4ab6-a4bd-f016499a9aad. {{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG nova.network.neutron [req-f68a2da9-a93e-4f81-b05a-aecd9e589bdf req-e7717916-a61f-4669-958a-91e45e9e4c26 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Updating instance_info_cache with network_info: [{"id": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "address": "fa:16:3e:dd:52:dd", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f4d2191-16", "ovs_interfaceid": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-f68a2da9-a93e-4f81-b05a-aecd9e589bdf req-e7717916-a61f-4669-958a-91e45e9e4c26 service nova] Releasing lock "refresh_cache-fe0bde76-a4f8-4865-91af-2bd3790587a7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': 
{'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.797s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG nova.compute.manager [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Start building networks asynchronously for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG nova.compute.manager [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Allocating IP information in the background. {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG nova.network.neutron [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:03:55 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 16:03:55 user nova-compute[71605]: DEBUG nova.compute.manager [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Start building block device mappings for instance. 
{{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG nova.policy [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c92692a1d38b4531a4e7f42660a54c7b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a92cea9e1182477ca669c506b42eda60', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG nova.compute.manager [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Start spawning the instance on the hypervisor. {{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:03:55 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Creating image(s) Apr 20 16:03:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Acquiring lock "/opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "/opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "/opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:55 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None 
req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.167s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.159s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw 
/opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk 1073741824" returned: 0 in 0.061s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.228s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.182s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Checking if we can resize image /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk. 
size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG nova.network.neutron [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Successfully created port: 74703b46-6b03-4752-953b-9c64a63249c8 {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk --force-share --output=json" returned: 0 in 0.162s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Cannot resize image /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk to a smaller size. 
{{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG nova.objects.instance [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lazy-loading 'migration_context' on Instance uuid dc918ed4-8bc6-4a4f-a189-d6cdd5817854 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Ensure instance console log exists: /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:57 user nova-compute[71605]: DEBUG nova.compute.manager [req-a973ffdd-407e-4309-9ce5-5cf072b6dd3a req-e9f79db4-b0da-4e5e-a31a-4c5d55f6788c service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received 
event network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a973ffdd-407e-4309-9ce5-5cf072b6dd3a req-e9f79db4-b0da-4e5e-a31a-4c5d55f6788c service nova] Acquiring lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a973ffdd-407e-4309-9ce5-5cf072b6dd3a req-e9f79db4-b0da-4e5e-a31a-4c5d55f6788c service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a973ffdd-407e-4309-9ce5-5cf072b6dd3a req-e9f79db4-b0da-4e5e-a31a-4c5d55f6788c service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:57 user nova-compute[71605]: DEBUG nova.compute.manager [req-a973ffdd-407e-4309-9ce5-5cf072b6dd3a req-e9f79db4-b0da-4e5e-a31a-4c5d55f6788c service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] No waiting events found dispatching network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:03:57 user nova-compute[71605]: WARNING nova.compute.manager [req-a973ffdd-407e-4309-9ce5-5cf072b6dd3a req-e9f79db4-b0da-4e5e-a31a-4c5d55f6788c service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received unexpected event network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad for instance with vm_state building and task_state spawning. 
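
Several entries earlier, the qemu-img create and qemu-img info runs for instance dc918ed4-8bc6-4a4f-a189-d6cdd5817854 built its root disk as a qcow2 copy-on-write overlay on the cached base image and then inspected the result. A standalone sketch of those two commands (paths and the 1073741824-byte size are copied from the log; nova wraps the calls in oslo_concurrency's prlimit helper rather than invoking them directly like this):

    import json
    import subprocess

    base = "/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890"
    disk = "/opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk"

    # Create the instance disk as a qcow2 overlay on the raw base image.
    subprocess.run(
        ["qemu-img", "create", "-f", "qcow2",
         "-o", f"backing_file={base},backing_fmt=raw", disk, "1073741824"],
        check=True)

    # Inspect the overlay the same way the "qemu-img info --force-share" runs above do.
    info = json.loads(subprocess.run(
        ["qemu-img", "info", "--force-share", "--output=json", disk],
        check=True, capture_output=True, text=True).stdout)
    print(info["virtual-size"])  # 1073741824 bytes, the flavor's 1 GiB root disk
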
Apr 20 16:03:57 user nova-compute[71605]: DEBUG nova.compute.manager [req-9808404b-aa74-48e8-a446-4ad87cbb993a req-8facf5c7-205e-4d65-ac3b-f5178bc733a3 service nova] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Received event network-vif-plugged-8200d42f-0f8f-439d-8ea8-1eea4fba54d6 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-9808404b-aa74-48e8-a446-4ad87cbb993a req-8facf5c7-205e-4d65-ac3b-f5178bc733a3 service nova] Acquiring lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-9808404b-aa74-48e8-a446-4ad87cbb993a req-8facf5c7-205e-4d65-ac3b-f5178bc733a3 service nova] Lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.005s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-9808404b-aa74-48e8-a446-4ad87cbb993a req-8facf5c7-205e-4d65-ac3b-f5178bc733a3 service nova] Lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:57 user nova-compute[71605]: DEBUG nova.compute.manager [req-9808404b-aa74-48e8-a446-4ad87cbb993a req-8facf5c7-205e-4d65-ac3b-f5178bc733a3 service nova] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] No waiting events found dispatching network-vif-plugged-8200d42f-0f8f-439d-8ea8-1eea4fba54d6 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:03:57 user nova-compute[71605]: WARNING nova.compute.manager [req-9808404b-aa74-48e8-a446-4ad87cbb993a req-8facf5c7-205e-4d65-ac3b-f5178bc733a3 service nova] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Received unexpected event network-vif-plugged-8200d42f-0f8f-439d-8ea8-1eea4fba54d6 for instance with vm_state building and task_state spawning. 
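
The repeated "Received event network-vif-plugged-..." entries above arrive from Neutron through Nova's os-server-external-events API; because no waiter is registered for the ports yet, each one is logged as an unexpected event. A hedged illustration of the request shape (the endpoint URL and token are placeholders, and Neutron uses its own client rather than requests):

    import requests

    compute_endpoint = "http://controller/compute/v2.1"  # placeholder endpoint
    token = "<keystone-token>"                           # placeholder credential

    # One external event per (instance, port), matching the entries above.
    payload = {"events": [{
        "name": "network-vif-plugged",
        "server_uuid": "e8f62d46-e2dc-4870-adf1-f62d88bb653b",
        "tag": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6",   # the Neutron port id
        "status": "completed",
    }]}

    requests.post(compute_endpoint + "/os-server-external-events",
                  headers={"X-Auth-Token": token}, json=payload, timeout=10)
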
Apr 20 16:03:57 user nova-compute[71605]: DEBUG nova.network.neutron [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Successfully updated port: 74703b46-6b03-4752-953b-9c64a63249c8 {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:03:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Acquiring lock "refresh_cache-dc918ed4-8bc6-4a4f-a189-d6cdd5817854" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:03:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Acquired lock "refresh_cache-dc918ed4-8bc6-4a4f-a189-d6cdd5817854" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:03:57 user nova-compute[71605]: DEBUG nova.network.neutron [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:03:57 user nova-compute[71605]: DEBUG nova.network.neutron [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Instance cache missing network info. 
{{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.network.neutron [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Updating instance_info_cache with network_info: [{"id": "74703b46-6b03-4752-953b-9c64a63249c8", "address": "fa:16:3e:c5:94:d0", "network": {"id": "40132b20-6bfd-4f5a-8f6f-75769961d157", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-683065417-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a92cea9e1182477ca669c506b42eda60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap74703b46-6b", "ovs_interfaceid": "74703b46-6b03-4752-953b-9c64a63249c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Releasing lock "refresh_cache-dc918ed4-8bc6-4a4f-a189-d6cdd5817854" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.compute.manager [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Instance network_info: |[{"id": "74703b46-6b03-4752-953b-9c64a63249c8", "address": "fa:16:3e:c5:94:d0", "network": {"id": "40132b20-6bfd-4f5a-8f6f-75769961d157", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-683065417-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a92cea9e1182477ca669c506b42eda60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap74703b46-6b", "ovs_interfaceid": "74703b46-6b03-4752-953b-9c64a63249c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Start _get_guest_xml 
network_info=[{"id": "74703b46-6b03-4752-953b-9c64a63249c8", "address": "fa:16:3e:c5:94:d0", "network": {"id": "40132b20-6bfd-4f5a-8f6f-75769961d157", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-683065417-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a92cea9e1182477ca669c506b42eda60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap74703b46-6b", "ovs_interfaceid": "74703b46-6b03-4752-953b-9c64a63249c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': '4ac69ea5-e5d7-40c8-864e-0a164d78a727'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:03:58 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:03:58 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
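
The _get_guest_xml call above starts from the disk_info mapping (root device vda on the virtio bus) and the qcow2 image backend, and renders them into the guest's libvirt domain XML. A minimal sketch of the kind of disk element that mapping corresponds to, built with ElementTree; this is an illustration of the shape, not nova's config generator, and driver options are trimmed:

    import xml.etree.ElementTree as ET

    # Root disk: qcow2 file-backed, presented to the guest as vda on the virtio bus.
    disk = ET.Element("disk", type="file", device="disk")
    ET.SubElement(disk, "driver", name="qemu", type="qcow2")
    ET.SubElement(disk, "source",
                  file="/opt/stack/data/nova/instances/"
                       "dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk")
    ET.SubElement(disk, "target", dev="vda", bus="virtio")
    print(ET.tostring(disk, encoding="unicode"))
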
Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:03:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-247130899',display_name='tempest-VolumesAdminNegativeTest-server-247130899',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-247130899',id=10,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHvL2UEJPc4nJnMX0NLHsUyPpamaI8REueYO620VKU6jmG9moA3aOhnIV+8OJ4FygGtNs0JXD2mYZ/x6dT7j7bCftPAI8gs/5YWqGZxyEGNZggDwOTj0cc8sKDuS204Umw==',key_name='tempest-keypair-1405486078',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a92cea9e1182477ca669c506b42eda60',ramdisk_id='',reservation_id='r-0jl8eopp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-978356230',owner_user_name='tempest-VolumesAdminNegativeTest-978356230-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:03:56Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c92692a1d38b4531a4e7f42660a54c7b',uuid=dc918ed4-8bc6-4a4f-a189-d6cdd5817854,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "74703b46-6b03-4752-953b-9c64a63249c8", "address": "fa:16:3e:c5:94:d0", "network": {"id": "40132b20-6bfd-4f5a-8f6f-75769961d157", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-683065417-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a92cea9e1182477ca669c506b42eda60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap74703b46-6b", "ovs_interfaceid": "74703b46-6b03-4752-953b-9c64a63249c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Converting VIF {"id": "74703b46-6b03-4752-953b-9c64a63249c8", "address": "fa:16:3e:c5:94:d0", "network": {"id": "40132b20-6bfd-4f5a-8f6f-75769961d157", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-683065417-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "a92cea9e1182477ca669c506b42eda60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap74703b46-6b", "ovs_interfaceid": "74703b46-6b03-4752-953b-9c64a63249c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:94:d0,bridge_name='br-int',has_traffic_filtering=True,id=74703b46-6b03-4752-953b-9c64a63249c8,network=Network(40132b20-6bfd-4f5a-8f6f-75769961d157),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74703b46-6b') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.objects.instance [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lazy-loading 'pci_devices' on Instance uuid dc918ed4-8bc6-4a4f-a189-d6cdd5817854 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] End _get_guest_xml xml= Apr 20 16:03:58 user nova-compute[71605]: dc918ed4-8bc6-4a4f-a189-d6cdd5817854 Apr 20 16:03:58 user nova-compute[71605]: instance-0000000a Apr 20 16:03:58 user nova-compute[71605]: 131072 Apr 20 16:03:58 user nova-compute[71605]: 1 Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: tempest-VolumesAdminNegativeTest-server-247130899 Apr 20 16:03:58 user nova-compute[71605]: 2023-04-20 16:03:58 Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: 128 Apr 20 16:03:58 user nova-compute[71605]: 1 Apr 20 16:03:58 user nova-compute[71605]: 0 Apr 20 16:03:58 user nova-compute[71605]: 0 Apr 20 16:03:58 user nova-compute[71605]: 1 Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: tempest-VolumesAdminNegativeTest-978356230-project-member Apr 20 16:03:58 user nova-compute[71605]: tempest-VolumesAdminNegativeTest-978356230 Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: OpenStack Foundation Apr 20 16:03:58 user nova-compute[71605]: OpenStack Nova Apr 20 16:03:58 user nova-compute[71605]: 0.0.0 Apr 20 16:03:58 user 
nova-compute[71605]: dc918ed4-8bc6-4a4f-a189-d6cdd5817854 Apr 20 16:03:58 user nova-compute[71605]: dc918ed4-8bc6-4a4f-a189-d6cdd5817854 Apr 20 16:03:58 user nova-compute[71605]: Virtual Machine Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: hvm Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Nehalem Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: /dev/urandom Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: Apr 20 16:03:58 user nova-compute[71605]: {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:03:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-247130899',display_name='tempest-VolumesAdminNegativeTest-server-247130899',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-247130899',id=10,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHvL2UEJPc4nJnMX0NLHsUyPpamaI8REueYO620VKU6jmG9moA3aOhnIV+8OJ4FygGtNs0JXD2mYZ/x6dT7j7bCftPAI8gs/5YWqGZxyEGNZggDwOTj0cc8sKDuS204Umw==',key_name='tempest-keypair-1405486078',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a92cea9e1182477ca669c506b42eda60',ramdisk_id='',reservation_id='r-0jl8eopp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-978356230',owner_user_name='tempest-VolumesAdminNegativeTest-978356230-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:03:56Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c92692a1d38b4531a4e7f42660a54c7b',uuid=dc918ed4-8bc6-4a4f-a189-d6cdd5817854,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "74703b46-6b03-4752-953b-9c64a63249c8", "address": "fa:16:3e:c5:94:d0", "network": {"id": "40132b20-6bfd-4f5a-8f6f-75769961d157", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-683065417-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a92cea9e1182477ca669c506b42eda60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap74703b46-6b", "ovs_interfaceid": "74703b46-6b03-4752-953b-9c64a63249c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Converting VIF {"id": "74703b46-6b03-4752-953b-9c64a63249c8", "address": "fa:16:3e:c5:94:d0", "network": {"id": "40132b20-6bfd-4f5a-8f6f-75769961d157", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-683065417-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "a92cea9e1182477ca669c506b42eda60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap74703b46-6b", "ovs_interfaceid": "74703b46-6b03-4752-953b-9c64a63249c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:94:d0,bridge_name='br-int',has_traffic_filtering=True,id=74703b46-6b03-4752-953b-9c64a63249c8,network=Network(40132b20-6bfd-4f5a-8f6f-75769961d157),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74703b46-6b') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG os_vif [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:94:d0,bridge_name='br-int',has_traffic_filtering=True,id=74703b46-6b03-4752-953b-9c64a63249c8,network=Network(40132b20-6bfd-4f5a-8f6f-75769961d157),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74703b46-6b') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74703b46-6b, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap74703b46-6b, col_values=(('external_ids', {'iface-id': '74703b46-6b03-4752-953b-9c64a63249c8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c5:94:d0', 'vm-uuid': 'dc918ed4-8bc6-4a4f-a189-d6cdd5817854'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:58 user nova-compute[71605]: INFO os_vif [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:94:d0,bridge_name='br-int',has_traffic_filtering=True,id=74703b46-6b03-4752-953b-9c64a63249c8,network=Network(40132b20-6bfd-4f5a-8f6f-75769961d157),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74703b46-6b') Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] No BDM found with device name vda, not building metadata. {{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] No VIF found with MAC fa:16:3e:c5:94:d0, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:03:58 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] VM Resumed (Lifecycle Event) Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.compute.manager [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:03:58 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Instance spawned successfully. 
Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Found default for hw_disk_bus of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 
tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Found default for hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:58 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:03:58 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] VM Started (Lifecycle Event) Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:03:58 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:03:58 user nova-compute[71605]: INFO nova.compute.manager [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Took 10.65 seconds to spawn the instance on the hypervisor. Apr 20 16:03:58 user nova-compute[71605]: DEBUG nova.compute.manager [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:58 user nova-compute[71605]: INFO nova.compute.manager [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Took 11.81 seconds to build instance. 
Apr 20 16:03:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbc47a67-5c04-448f-980d-e4392a6e5558 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.905s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG nova.compute.manager [req-f9a9a0ac-d905-4d3d-ae9e-f35253773d3c req-04d7a079-a529-401e-96ba-e5ad51bee183 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received event network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-f9a9a0ac-d905-4d3d-ae9e-f35253773d3c req-04d7a079-a529-401e-96ba-e5ad51bee183 service nova] Acquiring lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-f9a9a0ac-d905-4d3d-ae9e-f35253773d3c req-04d7a079-a529-401e-96ba-e5ad51bee183 service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-f9a9a0ac-d905-4d3d-ae9e-f35253773d3c req-04d7a079-a529-401e-96ba-e5ad51bee183 service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG nova.compute.manager [req-f9a9a0ac-d905-4d3d-ae9e-f35253773d3c req-04d7a079-a529-401e-96ba-e5ad51bee183 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] No waiting events found dispatching network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:03:59 user nova-compute[71605]: WARNING nova.compute.manager [req-f9a9a0ac-d905-4d3d-ae9e-f35253773d3c req-04d7a079-a529-401e-96ba-e5ad51bee183 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received unexpected event network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad for instance with vm_state building and task_state spawning. Apr 20 16:03:59 user nova-compute[71605]: DEBUG nova.compute.manager [req-f9a9a0ac-d905-4d3d-ae9e-f35253773d3c req-04d7a079-a529-401e-96ba-e5ad51bee183 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Received event network-changed-74703b46-6b03-4752-953b-9c64a63249c8 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG nova.compute.manager [req-f9a9a0ac-d905-4d3d-ae9e-f35253773d3c req-04d7a079-a529-401e-96ba-e5ad51bee183 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Refreshing instance network info cache due to event network-changed-74703b46-6b03-4752-953b-9c64a63249c8. 
{{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-f9a9a0ac-d905-4d3d-ae9e-f35253773d3c req-04d7a079-a529-401e-96ba-e5ad51bee183 service nova] Acquiring lock "refresh_cache-dc918ed4-8bc6-4a4f-a189-d6cdd5817854" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-f9a9a0ac-d905-4d3d-ae9e-f35253773d3c req-04d7a079-a529-401e-96ba-e5ad51bee183 service nova] Acquired lock "refresh_cache-dc918ed4-8bc6-4a4f-a189-d6cdd5817854" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG nova.network.neutron [req-f9a9a0ac-d905-4d3d-ae9e-f35253773d3c req-04d7a079-a529-401e-96ba-e5ad51bee183 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Refreshing network info cache for port 74703b46-6b03-4752-953b-9c64a63249c8 {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:03:59 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] VM Resumed (Lifecycle Event) Apr 20 16:03:59 user nova-compute[71605]: DEBUG nova.compute.manager [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:03:59 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Instance spawned successfully. 
Apr 20 16:03:59 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Found default for hw_disk_bus of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Found default for hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Synchronizing instance power state 
after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:03:59 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:03:59 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:03:59 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] VM Started (Lifecycle Event) Apr 20 16:03:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:03:59 user nova-compute[71605]: INFO nova.compute.manager [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Took 10.16 seconds to spawn the instance on the hypervisor. Apr 20 16:03:59 user nova-compute[71605]: DEBUG nova.compute.manager [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:03:59 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:03:59 user nova-compute[71605]: INFO nova.compute.manager [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Took 11.14 seconds to build instance. 
Apr 20 16:03:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cc1c18a3-19de-4d90-b0cf-4dd113b494e0 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.281s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:03:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:00 user nova-compute[71605]: DEBUG nova.network.neutron [req-f9a9a0ac-d905-4d3d-ae9e-f35253773d3c req-04d7a079-a529-401e-96ba-e5ad51bee183 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Updated VIF entry in instance network info cache for port 74703b46-6b03-4752-953b-9c64a63249c8. {{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:04:00 user nova-compute[71605]: DEBUG nova.network.neutron [req-f9a9a0ac-d905-4d3d-ae9e-f35253773d3c req-04d7a079-a529-401e-96ba-e5ad51bee183 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Updating instance_info_cache with network_info: [{"id": "74703b46-6b03-4752-953b-9c64a63249c8", "address": "fa:16:3e:c5:94:d0", "network": {"id": "40132b20-6bfd-4f5a-8f6f-75769961d157", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-683065417-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a92cea9e1182477ca669c506b42eda60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap74703b46-6b", "ovs_interfaceid": "74703b46-6b03-4752-953b-9c64a63249c8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:04:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-f9a9a0ac-d905-4d3d-ae9e-f35253773d3c req-04d7a079-a529-401e-96ba-e5ad51bee183 service nova] Releasing lock "refresh_cache-dc918ed4-8bc6-4a4f-a189-d6cdd5817854" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:04:00 user 
nova-compute[71605]: DEBUG nova.compute.manager [req-09925632-1824-4ed1-a3ee-08dadf6683b6 req-b2351065-a771-436b-9faf-f5c7092246f3 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Received event network-vif-plugged-74703b46-6b03-4752-953b-9c64a63249c8 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:04:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-09925632-1824-4ed1-a3ee-08dadf6683b6 req-b2351065-a771-436b-9faf-f5c7092246f3 service nova] Acquiring lock "dc918ed4-8bc6-4a4f-a189-d6cdd5817854-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-09925632-1824-4ed1-a3ee-08dadf6683b6 req-b2351065-a771-436b-9faf-f5c7092246f3 service nova] Lock "dc918ed4-8bc6-4a4f-a189-d6cdd5817854-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-09925632-1824-4ed1-a3ee-08dadf6683b6 req-b2351065-a771-436b-9faf-f5c7092246f3 service nova] Lock "dc918ed4-8bc6-4a4f-a189-d6cdd5817854-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:00 user nova-compute[71605]: DEBUG nova.compute.manager [req-09925632-1824-4ed1-a3ee-08dadf6683b6 req-b2351065-a771-436b-9faf-f5c7092246f3 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] No waiting events found dispatching network-vif-plugged-74703b46-6b03-4752-953b-9c64a63249c8 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:04:00 user nova-compute[71605]: WARNING nova.compute.manager [req-09925632-1824-4ed1-a3ee-08dadf6683b6 req-b2351065-a771-436b-9faf-f5c7092246f3 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Received unexpected event network-vif-plugged-74703b46-6b03-4752-953b-9c64a63249c8 for instance with vm_state building and task_state spawning. 
Apr 20 16:04:00 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:00 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:00 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:02 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:04:02 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] VM Resumed (Lifecycle Event) Apr 20 16:04:02 user nova-compute[71605]: DEBUG nova.compute.manager [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:04:02 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:04:02 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Instance spawned successfully. 
Apr 20 16:04:02 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:04:02 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:04:02 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:04:02 user nova-compute[71605]: DEBUG nova.compute.manager [req-d6d2fb4e-68b4-4825-97fd-6b851eefe268 req-0805ae4c-2494-42b6-9e2f-9d1fe327e9b8 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Received event network-vif-plugged-74703b46-6b03-4752-953b-9c64a63249c8 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:04:02 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-d6d2fb4e-68b4-4825-97fd-6b851eefe268 req-0805ae4c-2494-42b6-9e2f-9d1fe327e9b8 service nova] Acquiring lock "dc918ed4-8bc6-4a4f-a189-d6cdd5817854-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:02 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-d6d2fb4e-68b4-4825-97fd-6b851eefe268 req-0805ae4c-2494-42b6-9e2f-9d1fe327e9b8 service nova] Lock "dc918ed4-8bc6-4a4f-a189-d6cdd5817854-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:02 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-d6d2fb4e-68b4-4825-97fd-6b851eefe268 req-0805ae4c-2494-42b6-9e2f-9d1fe327e9b8 service nova] Lock "dc918ed4-8bc6-4a4f-a189-d6cdd5817854-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:02 user nova-compute[71605]: DEBUG nova.compute.manager [req-d6d2fb4e-68b4-4825-97fd-6b851eefe268 req-0805ae4c-2494-42b6-9e2f-9d1fe327e9b8 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] No waiting events found dispatching network-vif-plugged-74703b46-6b03-4752-953b-9c64a63249c8 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:04:02 user nova-compute[71605]: WARNING nova.compute.manager [req-d6d2fb4e-68b4-4825-97fd-6b851eefe268 req-0805ae4c-2494-42b6-9e2f-9d1fe327e9b8 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Received unexpected event network-vif-plugged-74703b46-6b03-4752-953b-9c64a63249c8 for instance with 
vm_state building and task_state spawning. Apr 20 16:04:02 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:04:02 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Found default for hw_disk_bus of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:04:02 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:04:02 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:04:02 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:04:02 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Found default for hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:04:02 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 20 16:04:02 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:04:02 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] VM Started (Lifecycle Event) Apr 20 16:04:02 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:04:02 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:04:02 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:04:02 user nova-compute[71605]: INFO nova.compute.manager [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Took 6.51 seconds to spawn the instance on the hypervisor. Apr 20 16:04:02 user nova-compute[71605]: DEBUG nova.compute.manager [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:04:02 user nova-compute[71605]: INFO nova.compute.manager [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Took 7.77 seconds to build instance. 
Apr 20 16:04:02 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-37a43ff0-48b5-4efe-a058-06b25bebfc7a tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "dc918ed4-8bc6-4a4f-a189-d6cdd5817854" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.982s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:03 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:03 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:04 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:08 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:09 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:09 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:13 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:14 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:18 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:18 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:19 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:20 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:23 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:24 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:24 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:26 user nova-compute[71605]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:28 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:29 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:30 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:33 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:38 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:04:38 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:39 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:04:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:39 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:04:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:39 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:39 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:39 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:39 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:39 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:39 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:39 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk --force-share --output=json {{(pid=71605) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:39 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk --force-share --output=json" returned: 0 in 0.127s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:39 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd 
(subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5bda996a-1bfe-4f43-aa02-36a864153588/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5bda996a-1bfe-4f43-aa02-36a864153588/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5bda996a-1bfe-4f43-aa02-36a864153588/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5bda996a-1bfe-4f43-aa02-36a864153588/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab/disk --force-share --output=json" returned: 0 in 0.128s 
{{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6d55e5bd-9b03-40a9-bca9-88545039597c/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6d55e5bd-9b03-40a9-bca9-88545039597c/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6d55e5bd-9b03-40a9-bca9-88545039597c/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6d55e5bd-9b03-40a9-bca9-88545039597c/disk --force-share --output=json" returned: 0 in 0.159s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None 
None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1036e0f-683f-4dfd-b0ad-6187d90ff2f6/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1036e0f-683f-4dfd-b0ad-6187d90ff2f6/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1036e0f-683f-4dfd-b0ad-6187d90ff2f6/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e1036e0f-683f-4dfd-b0ad-6187d90ff2f6/disk --force-share --output=json" returned: 0 in 0.127s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a5e68386-3b32-458b-9808-797d041c2235/disk --force-share 
--output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a5e68386-3b32-458b-9808-797d041c2235/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a5e68386-3b32-458b-9808-797d041c2235/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a5e68386-3b32-458b-9808-797d041c2235/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:42 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:04:42 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 16:04:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=7727MB free_disk=26.249740600585938GB free_vcpus=2 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:04:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 65fc650d-2181-46cb-b91b-4a1104b2afab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:04:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance e8f62d46-e2dc-4870-adf1-f62d88bb653b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:04:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance fe0bde76-a4f8-4865-91af-2bd3790587a7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:04:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 6d55e5bd-9b03-40a9-bca9-88545039597c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:04:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance a5e68386-3b32-458b-9808-797d041c2235 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:04:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance e1036e0f-683f-4dfd-b0ad-6187d90ff2f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:04:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance dc918ed4-8bc6-4a4f-a189-d6cdd5817854 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:04:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 5bda996a-1bfe-4f43-aa02-36a864153588 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:04:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance d4ea4d29-b178-4da2-b971-76f97031b244 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:04:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:04:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 10 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:04:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=1792MB phys_disk=40GB used_disk=10GB total_vcpus=12 used_vcpus=10 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:04:43 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:04:43 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:04:43 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:04:43 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.485s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:43 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG 
nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Acquiring lock "6d55e5bd-9b03-40a9-bca9-88545039597c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Lock "6d55e5bd-9b03-40a9-bca9-88545039597c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Acquiring lock "6d55e5bd-9b03-40a9-bca9-88545039597c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Lock "6d55e5bd-9b03-40a9-bca9-88545039597c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Lock "6d55e5bd-9b03-40a9-bca9-88545039597c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:44 user nova-compute[71605]: INFO nova.compute.manager [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Terminating instance Apr 20 16:04:44 user nova-compute[71605]: DEBUG nova.compute.manager [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Start destroying the instance on the hypervisor. 
{{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-d4ea4d29-b178-4da2-b971-76f97031b244" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-d4ea4d29-b178-4da2-b971-76f97031b244" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Acquiring lock "dd78d74a-11d6-4f06-8092-5088b3fad412" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "dd78d74a-11d6-4f06-8092-5088b3fad412" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG nova.compute.manager [req-1005a8ff-044c-4fbe-803e-646fe4a4ae93 req-cf1a7827-6f0c-4639-98b8-4100cebfbe1b service nova] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Received event network-vif-unplugged-fe98bff4-7b0f-4244-a254-fc9359c00aae {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-1005a8ff-044c-4fbe-803e-646fe4a4ae93 req-cf1a7827-6f0c-4639-98b8-4100cebfbe1b service nova] Acquiring lock "6d55e5bd-9b03-40a9-bca9-88545039597c-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-1005a8ff-044c-4fbe-803e-646fe4a4ae93 req-cf1a7827-6f0c-4639-98b8-4100cebfbe1b service nova] Lock "6d55e5bd-9b03-40a9-bca9-88545039597c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-1005a8ff-044c-4fbe-803e-646fe4a4ae93 req-cf1a7827-6f0c-4639-98b8-4100cebfbe1b service nova] Lock "6d55e5bd-9b03-40a9-bca9-88545039597c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG nova.compute.manager [req-1005a8ff-044c-4fbe-803e-646fe4a4ae93 req-cf1a7827-6f0c-4639-98b8-4100cebfbe1b service nova] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] No waiting events found dispatching network-vif-unplugged-fe98bff4-7b0f-4244-a254-fc9359c00aae {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG nova.compute.manager [req-1005a8ff-044c-4fbe-803e-646fe4a4ae93 req-cf1a7827-6f0c-4639-98b8-4100cebfbe1b service nova] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Received event network-vif-unplugged-fe98bff4-7b0f-4244-a254-fc9359c00aae for instance with task_state deleting. {{(pid=71605) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 16:04:44 user nova-compute[71605]: DEBUG nova.compute.manager [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Starting instance... {{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:04:45 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Instance destroyed successfully. 
Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.objects.instance [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Lazy-loading 'resources' on Instance uuid 6d55e5bd-9b03-40a9-bca9-88545039597c {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:04:45 user nova-compute[71605]: INFO nova.compute.claims [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Claim successful on node user Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:02:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-808094645',display_name='tempest-DeleteServersTestJSON-server-808094645',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-808094645',id=3,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-20T16:03:12Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='3336309776d848efaf237863a5b9bfeb',ramdisk_id='',reservation_id='r-scto8378',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-DeleteServersTestJSON-1315524687',owner_user_name='tempest-DeleteServersTestJSON-1315524687-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:03:12Z,user_data=None,user_id='8a7606e886554ff7948a4e246dd98677',uuid=6d55e5bd-9b03-40a9-bca9-88545039597c,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fe98bff4-7b0f-4244-a254-fc9359c00aae", "address": "fa:16:3e:8f:d5:e9", "network": {"id": "892f26a6-0815-41df-a910-d2e69e162820", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1019885347-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3336309776d848efaf237863a5b9bfeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe98bff4-7b", "ovs_interfaceid": "fe98bff4-7b0f-4244-a254-fc9359c00aae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Converting VIF {"id": "fe98bff4-7b0f-4244-a254-fc9359c00aae", "address": 
"fa:16:3e:8f:d5:e9", "network": {"id": "892f26a6-0815-41df-a910-d2e69e162820", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-1019885347-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3336309776d848efaf237863a5b9bfeb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe98bff4-7b", "ovs_interfaceid": "fe98bff4-7b0f-4244-a254-fc9359c00aae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:d5:e9,bridge_name='br-int',has_traffic_filtering=True,id=fe98bff4-7b0f-4244-a254-fc9359c00aae,network=Network(892f26a6-0815-41df-a910-d2e69e162820),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe98bff4-7b') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG os_vif [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:d5:e9,bridge_name='br-int',has_traffic_filtering=True,id=fe98bff4-7b0f-4244-a254-fc9359c00aae,network=Network(892f26a6-0815-41df-a910-d2e69e162820),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe98bff4-7b') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe98bff4-7b, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:45 user nova-compute[71605]: INFO os_vif [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Successfully unplugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:8f:d5:e9,bridge_name='br-int',has_traffic_filtering=True,id=fe98bff4-7b0f-4244-a254-fc9359c00aae,network=Network(892f26a6-0815-41df-a910-d2e69e162820),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe98bff4-7b') Apr 20 16:04:45 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Deleting instance files /opt/stack/data/nova/instances/6d55e5bd-9b03-40a9-bca9-88545039597c_del Apr 20 16:04:45 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Deletion of /opt/stack/data/nova/instances/6d55e5bd-9b03-40a9-bca9-88545039597c_del complete Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.compute.manager [req-a5127230-c1c3-4b0b-bea0-c7371f7a15fd req-8816c017-a649-4e97-a4a0-f302eaad2165 service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] Received event network-changed-4bce4922-407c-4e11-b089-154a3299ea1c {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.compute.manager [req-a5127230-c1c3-4b0b-bea0-c7371f7a15fd req-8816c017-a649-4e97-a4a0-f302eaad2165 service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] Refreshing instance network info cache due to event network-changed-4bce4922-407c-4e11-b089-154a3299ea1c. {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a5127230-c1c3-4b0b-bea0-c7371f7a15fd req-8816c017-a649-4e97-a4a0-f302eaad2165 service nova] Acquiring lock "refresh_cache-a5e68386-3b32-458b-9808-797d041c2235" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a5127230-c1c3-4b0b-bea0-c7371f7a15fd req-8816c017-a649-4e97-a4a0-f302eaad2165 service nova] Acquired lock "refresh_cache-a5e68386-3b32-458b-9808-797d041c2235" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.network.neutron [req-a5127230-c1c3-4b0b-bea0-c7371f7a15fd req-8816c017-a649-4e97-a4a0-f302eaad2165 service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] Refreshing network info cache for port 4bce4922-407c-4e11-b089-154a3299ea1c {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Checking UEFI support for host arch (x86_64) {{(pid=71605) supports_uefi /opt/stack/nova/nova/virt/libvirt/host.py:1722}} Apr 20 16:04:45 user nova-compute[71605]: INFO nova.virt.libvirt.host [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] UEFI support detected Apr 20 16:04:45 user nova-compute[71605]: INFO nova.compute.manager [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 
tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Took 0.98 seconds to destroy the instance on the hypervisor. Apr 20 16:04:45 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Updating instance_info_cache with network_info: [{"id": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "address": "fa:16:3e:44:d8:d0", "network": {"id": "c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b36b1a4-9a", "ovs_interfaceid": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info 
/opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.473s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Start building networks asynchronously for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock "refresh_cache-d4ea4d29-b178-4da2-b971-76f97031b244" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Allocating IP information in the background. {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.network.neutron [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:04:45 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Start spawning the instance on the hypervisor. 
{{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:04:45 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Creating image(s) Apr 20 16:04:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Acquiring lock "/opt/stack/data/nova/instances/dd78d74a-11d6-4f06-8092-5088b3fad412/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "/opt/stack/data/nova/instances/dd78d74a-11d6-4f06-8092-5088b3fad412/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "/opt/stack/data/nova/instances/dd78d74a-11d6-4f06-8092-5088b3fad412/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.003s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.136s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd 
tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.140s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/dd78d74a-11d6-4f06-8092-5088b3fad412/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/dd78d74a-11d6-4f06-8092-5088b3fad412/disk 1073741824" returned: 0 in 0.051s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.198s {{(pid=71605) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG nova.policy [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9be25e958c6047068ab5ce63106b0754', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd8444d3c8f554a56967917670b19dc37', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.136s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Checking if we can resize image /opt/stack/data/nova/instances/dd78d74a-11d6-4f06-8092-5088b3fad412/disk. 
size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd78d74a-11d6-4f06-8092-5088b3fad412/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd78d74a-11d6-4f06-8092-5088b3fad412/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Cannot resize image /opt/stack/data/nova/instances/dd78d74a-11d6-4f06-8092-5088b3fad412/disk to a smaller size. {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG nova.objects.instance [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lazy-loading 'migration_context' on Instance uuid dd78d74a-11d6-4f06-8092-5088b3fad412 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Ensure instance console log exists: /opt/stack/data/nova/instances/dd78d74a-11d6-4f06-8092-5088b3fad412/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG nova.compute.manager [None req-3cb2fc70-596e-415f-ac12-97d5c168e6dc tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG nova.network.neutron [req-a5127230-c1c3-4b0b-bea0-c7371f7a15fd req-8816c017-a649-4e97-a4a0-f302eaad2165 service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] Updated VIF entry in instance network info cache for port 4bce4922-407c-4e11-b089-154a3299ea1c. {{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG nova.network.neutron [req-a5127230-c1c3-4b0b-bea0-c7371f7a15fd req-8816c017-a649-4e97-a4a0-f302eaad2165 service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] Updating instance_info_cache with network_info: [{"id": "4bce4922-407c-4e11-b089-154a3299ea1c", "address": "fa:16:3e:bd:61:95", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.17", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bce4922-40", "ovs_interfaceid": "4bce4922-407c-4e11-b089-154a3299ea1c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a5127230-c1c3-4b0b-bea0-c7371f7a15fd req-8816c017-a649-4e97-a4a0-f302eaad2165 service nova] Releasing lock "refresh_cache-a5e68386-3b32-458b-9808-797d041c2235" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:04:46 user nova-compute[71605]: INFO nova.compute.manager [None req-3cb2fc70-596e-415f-ac12-97d5c168e6dc 
tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] instance snapshotting Apr 20 16:04:46 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Took 1.45 seconds to deallocate network for instance. Apr 20 16:04:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "a5e68386-3b32-458b-9808-797d041c2235" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "a5e68386-3b32-458b-9808-797d041c2235" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "a5e68386-3b32-458b-9808-797d041c2235-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "a5e68386-3b32-458b-9808-797d041c2235-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "a5e68386-3b32-458b-9808-797d041c2235-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} 
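
[Editor's note, illustrative sketch] The repeated "Acquiring lock … by …", "Lock … acquired … :: waited", and "Lock … "released" … :: held" triplets above are emitted by oslo.concurrency's lockutils wrappers around each critical section. The sketch below is a minimal, hypothetical reconstruction of that pattern, not nova's actual resource-tracker or terminate code; only the lock names are taken from the log.

    # Illustrative only: how the "Acquiring/acquired/released" lock lines in the
    # log are produced by oslo.concurrency. Not nova's actual code.
    from oslo_concurrency import lockutils

    @lockutils.synchronized("compute_resources")
    def update_usage():
        # The synchronized() decorator logs how long the caller waited for the
        # lock and how long it was held, producing the ":: waited"/":: held"
        # DEBUG lines seen above.
        pass

    def terminate_instance(instance_uuid):
        # The context-manager form used for the per-instance "<uuid>" and
        # "<uuid>-events" locks ("Acquiring lock"/"Acquired lock"/"Releasing lock").
        with lockutils.lock(instance_uuid):
            with lockutils.lock(f"{instance_uuid}-events"):
                pass  # clear pending events, then tear the instance down

    if __name__ == "__main__":
        update_usage()
        terminate_instance("a5e68386-3b32-458b-9808-797d041c2235")
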
Apr 20 16:04:46 user nova-compute[71605]: INFO nova.compute.manager [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Terminating instance Apr 20 16:04:46 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-3cb2fc70-596e-415f-ac12-97d5c168e6dc tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Beginning live snapshot process Apr 20 16:04:46 user nova-compute[71605]: DEBUG nova.compute.manager [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Start destroying the instance on the hypervisor. {{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG nova.compute.manager [req-a39603ef-4139-4283-b4da-ae09429d073c req-4188f1e4-d82a-4d8c-ac38-abcd6d42c6e3 service nova] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Received event network-vif-plugged-fe98bff4-7b0f-4244-a254-fc9359c00aae {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a39603ef-4139-4283-b4da-ae09429d073c req-4188f1e4-d82a-4d8c-ac38-abcd6d42c6e3 service nova] Acquiring lock "6d55e5bd-9b03-40a9-bca9-88545039597c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a39603ef-4139-4283-b4da-ae09429d073c req-4188f1e4-d82a-4d8c-ac38-abcd6d42c6e3 service nova] Lock "6d55e5bd-9b03-40a9-bca9-88545039597c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a39603ef-4139-4283-b4da-ae09429d073c req-4188f1e4-d82a-4d8c-ac38-abcd6d42c6e3 service nova] Lock "6d55e5bd-9b03-40a9-bca9-88545039597c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:47 
user nova-compute[71605]: DEBUG nova.compute.manager [req-a39603ef-4139-4283-b4da-ae09429d073c req-4188f1e4-d82a-4d8c-ac38-abcd6d42c6e3 service nova] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] No waiting events found dispatching network-vif-plugged-fe98bff4-7b0f-4244-a254-fc9359c00aae {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:04:47 user nova-compute[71605]: WARNING nova.compute.manager [req-a39603ef-4139-4283-b4da-ae09429d073c req-4188f1e4-d82a-4d8c-ac38-abcd6d42c6e3 service nova] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Received unexpected event network-vif-plugged-fe98bff4-7b0f-4244-a254-fc9359c00aae for instance with vm_state deleted and task_state None. Apr 20 16:04:47 user nova-compute[71605]: DEBUG nova.compute.manager [req-a39603ef-4139-4283-b4da-ae09429d073c req-4188f1e4-d82a-4d8c-ac38-abcd6d42c6e3 service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Received event network-changed-b2af67f0-0768-4ebc-a21b-0ef6e2b3f264 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG nova.compute.manager [req-a39603ef-4139-4283-b4da-ae09429d073c req-4188f1e4-d82a-4d8c-ac38-abcd6d42c6e3 service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Refreshing instance network info cache due to event network-changed-b2af67f0-0768-4ebc-a21b-0ef6e2b3f264. {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a39603ef-4139-4283-b4da-ae09429d073c req-4188f1e4-d82a-4d8c-ac38-abcd6d42c6e3 service nova] Acquiring lock "refresh_cache-91f4b3d1-0fea-4378-94e3-c2bbfd8cad81" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a39603ef-4139-4283-b4da-ae09429d073c req-4188f1e4-d82a-4d8c-ac38-abcd6d42c6e3 service nova] Acquired lock "refresh_cache-91f4b3d1-0fea-4378-94e3-c2bbfd8cad81" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG nova.network.neutron [req-a39603ef-4139-4283-b4da-ae09429d073c req-4188f1e4-d82a-4d8c-ac38-abcd6d42c6e3 service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Refreshing network info cache for port b2af67f0-0768-4ebc-a21b-0ef6e2b3f264 {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-3cb2fc70-596e-415f-ac12-97d5c168e6dc tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk --force-share --output=json -f qcow2 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:04:47 user 
nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-3cb2fc70-596e-415f-ac12-97d5c168e6dc tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk --force-share --output=json -f qcow2" returned: 0 in 0.152s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-3cb2fc70-596e-415f-ac12-97d5c168e6dc tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk --force-share --output=json -f qcow2 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.481s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG nova.compute.manager [req-fb111c12-b3ef-4091-8bc6-37c024cbdf6f req-60c5761e-8dfb-4d03-9a1c-b7f43c388782 service nova] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Received event network-vif-deleted-fe98bff4-7b0f-4244-a254-fc9359c00aae {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:04:47 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Deleted allocations for instance 6d55e5bd-9b03-40a9-bca9-88545039597c Apr 20 16:04:47 user nova-compute[71605]: DEBUG nova.compute.manager [req-130cfb76-3537-4ac4-b806-47a611fd36b1 req-0e3130ac-2c50-4243-97e1-63e733704245 service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] Received event network-vif-unplugged-4bce4922-407c-4e11-b089-154a3299ea1c {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-130cfb76-3537-4ac4-b806-47a611fd36b1 req-0e3130ac-2c50-4243-97e1-63e733704245 service nova] 
Acquiring lock "a5e68386-3b32-458b-9808-797d041c2235-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-130cfb76-3537-4ac4-b806-47a611fd36b1 req-0e3130ac-2c50-4243-97e1-63e733704245 service nova] Lock "a5e68386-3b32-458b-9808-797d041c2235-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-130cfb76-3537-4ac4-b806-47a611fd36b1 req-0e3130ac-2c50-4243-97e1-63e733704245 service nova] Lock "a5e68386-3b32-458b-9808-797d041c2235-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG nova.compute.manager [req-130cfb76-3537-4ac4-b806-47a611fd36b1 req-0e3130ac-2c50-4243-97e1-63e733704245 service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] No waiting events found dispatching network-vif-unplugged-4bce4922-407c-4e11-b089-154a3299ea1c {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG nova.compute.manager [req-130cfb76-3537-4ac4-b806-47a611fd36b1 req-0e3130ac-2c50-4243-97e1-63e733704245 service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] Received event network-vif-unplugged-4bce4922-407c-4e11-b089-154a3299ea1c for instance with task_state deleting. 
{{(pid=71605) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-3cb2fc70-596e-415f-ac12-97d5c168e6dc tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk --force-share --output=json -f qcow2" returned: 0 in 0.139s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-3cb2fc70-596e-415f-ac12-97d5c168e6dc tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c4425233-7e2f-4a35-b20c-07012c2fd841 tempest-DeleteServersTestJSON-1315524687 tempest-DeleteServersTestJSON-1315524687-project-member] Lock "6d55e5bd-9b03-40a9-bca9-88545039597c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 3.191s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-3cb2fc70-596e-415f-ac12-97d5c168e6dc tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.143s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-3cb2fc70-596e-415f-ac12-97d5c168e6dc tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpfli2chtz/2c4bcdd2148c48b182cb148d62fe608b.delta 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-3cb2fc70-596e-415f-ac12-97d5c168e6dc tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpfli2chtz/2c4bcdd2148c48b182cb148d62fe608b.delta 1073741824" returned: 0 in 0.054s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:47 
user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-3cb2fc70-596e-415f-ac12-97d5c168e6dc tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Quiescing instance not available: QEMU guest agent is not enabled. Apr 20 16:04:47 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: a5e68386-3b32-458b-9808-797d041c2235] Instance destroyed successfully. Apr 20 16:04:47 user nova-compute[71605]: DEBUG nova.objects.instance [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lazy-loading 'resources' on Instance uuid a5e68386-3b32-458b-9808-797d041c2235 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:02:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-112425079',display_name='tempest-AttachVolumeNegativeTest-server-112425079',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-112425079',id=1,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBFt/ltdSvvBHQ2MsuXJOTGRFwD86myzO9h0omThgGXoNYZmwXr9cWEFLEKbGl6QHLxLCdivOfggvbdx8hlLQgYsXTya/bJWP27fOABo2+ny5YKslC9RhYnn4AafsHJgFg==',key_name='tempest-keypair-1817335126',keypairs=,launch_index=0,launched_at=2023-04-20T16:03:12Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='71cf2664111f45788d24092e8ceede9c',ramdisk_id='',reservation_id='r-huby0z08',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-308436039',owner_user_name='tempest-AttachVolumeNegativeTest-308436039-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:03:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='690c49feae904687826fb959ba5ba283',uuid=a5e68386-3b32-458b-9808-797d041c2235,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4bce4922-407c-4e11-b089-154a3299ea1c", "address": "fa:16:3e:bd:61:95", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.17", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bce4922-40", "ovs_interfaceid": "4bce4922-407c-4e11-b089-154a3299ea1c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Converting VIF {"id": "4bce4922-407c-4e11-b089-154a3299ea1c", "address": "fa:16:3e:bd:61:95", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.17", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bce4922-40", "ovs_interfaceid": "4bce4922-407c-4e11-b089-154a3299ea1c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bd:61:95,bridge_name='br-int',has_traffic_filtering=True,id=4bce4922-407c-4e11-b089-154a3299ea1c,network=Network(2dc9b3da-0124-4718-9f70-a131cd030480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bce4922-40') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG os_vif [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bd:61:95,bridge_name='br-int',has_traffic_filtering=True,id=4bce4922-407c-4e11-b089-154a3299ea1c,network=Network(2dc9b3da-0124-4718-9f70-a131cd030480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bce4922-40') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4bce4922-40, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:04:47 user nova-compute[71605]: INFO os_vif [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bd:61:95,bridge_name='br-int',has_traffic_filtering=True,id=4bce4922-407c-4e11-b089-154a3299ea1c,network=Network(2dc9b3da-0124-4718-9f70-a131cd030480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bce4922-40') Apr 20 16:04:47 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 
tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Deleting instance files /opt/stack/data/nova/instances/a5e68386-3b32-458b-9808-797d041c2235_del Apr 20 16:04:47 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Deletion of /opt/stack/data/nova/instances/a5e68386-3b32-458b-9808-797d041c2235_del complete Apr 20 16:04:47 user nova-compute[71605]: DEBUG nova.network.neutron [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Successfully created port: 14dcc4ff-4a09-446a-b0ea-d9989cd3fa16 {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:04:47 user nova-compute[71605]: INFO nova.compute.manager [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: a5e68386-3b32-458b-9808-797d041c2235] Took 0.87 seconds to destroy the instance on the hypervisor. Apr 20 16:04:47 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: a5e68386-3b32-458b-9808-797d041c2235] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:04:47 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: a5e68386-3b32-458b-9808-797d041c2235] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:04:48 user nova-compute[71605]: DEBUG nova.virt.libvirt.guest [None req-3cb2fc70-596e-415f-ac12-97d5c168e6dc tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=71605) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 20 16:04:48 user nova-compute[71605]: DEBUG nova.network.neutron [req-a39603ef-4139-4283-b4da-ae09429d073c req-4188f1e4-d82a-4d8c-ac38-abcd6d42c6e3 service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Updated VIF entry in instance network info cache for port b2af67f0-0768-4ebc-a21b-0ef6e2b3f264. 
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:04:48 user nova-compute[71605]: DEBUG nova.network.neutron [req-a39603ef-4139-4283-b4da-ae09429d073c req-4188f1e4-d82a-4d8c-ac38-abcd6d42c6e3 service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Updating instance_info_cache with network_info: [{"id": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "address": "fa:16:3e:d0:3f:7b", "network": {"id": "224391e3-9d6f-4e5f-b1bb-00dd1cd0ea06", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1568684394-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.131", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbbcfeb5266f4ca6b9738b18ba7d127e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2af67f0-07", "ovs_interfaceid": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:04:48 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a39603ef-4139-4283-b4da-ae09429d073c req-4188f1e4-d82a-4d8c-ac38-abcd6d42c6e3 service nova] Releasing lock "refresh_cache-91f4b3d1-0fea-4378-94e3-c2bbfd8cad81" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:04:48 user nova-compute[71605]: DEBUG nova.virt.libvirt.guest [None req-3cb2fc70-596e-415f-ac12-97d5c168e6dc tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=71605) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 20 16:04:48 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-3cb2fc70-596e-415f-ac12-97d5c168e6dc tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Skipping quiescing instance: QEMU guest agent is not enabled. 
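
[Editor's note, illustrative sketch] The live-snapshot sequence above creates a qcow2 ".delta" overlay on top of the cached base image, polls the COPY block job until the current cursor reaches the final cursor, then runs qemu-img convert to flatten the delta into the snapshot file that gets uploaded. The sketch below reproduces just the two qemu-img invocations shown in the log as a stand-alone Python script; the paths are placeholders, and nova actually drives these commands through oslo_concurrency.processutils rather than subprocess.

    # Illustrative sketch of the qemu-img steps from the snapshot flow above.
    import subprocess

    BASE = "/opt/stack/data/nova/instances/_base/<base-image-hash>"  # placeholder backing file
    DELTA = "/tmp/snapshot.delta"       # qcow2 overlay filled by the COPY block job
    OUTPUT = "/tmp/snapshot.qcow2"      # flattened snapshot image
    SIZE = "1073741824"                 # virtual disk size in bytes (from the log)

    def create_overlay():
        # Matches: qemu-img create -f qcow2 -o backing_file=<base>,backing_fmt=raw <delta> <size>
        subprocess.run(
            ["qemu-img", "create", "-f", "qcow2",
             "-o", f"backing_file={BASE},backing_fmt=raw", DELTA, SIZE],
            check=True,
        )

    def convert_delta():
        # Matches: qemu-img convert -t none -O qcow2 -f qcow2 <delta> <output>
        # "-t none" bypasses the host page cache, which is why the log first
        # checks that the instances directory supports direct I/O.
        subprocess.run(
            ["qemu-img", "convert", "-t", "none", "-O", "qcow2", "-f", "qcow2",
             DELTA, OUTPUT],
            check=True,
        )

    if __name__ == "__main__":
        create_overlay()
        # ...libvirt copies the live disk contents into the overlay here...
        convert_delta()
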
Apr 20 16:04:48 user nova-compute[71605]: DEBUG nova.privsep.utils [None req-3cb2fc70-596e-415f-ac12-97d5c168e6dc tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71605) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 20 16:04:48 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-3cb2fc70-596e-415f-ac12-97d5c168e6dc tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpfli2chtz/2c4bcdd2148c48b182cb148d62fe608b.delta /opt/stack/data/nova/instances/snapshots/tmpfli2chtz/2c4bcdd2148c48b182cb148d62fe608b {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:04:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-3cb2fc70-596e-415f-ac12-97d5c168e6dc tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpfli2chtz/2c4bcdd2148c48b182cb148d62fe608b.delta /opt/stack/data/nova/instances/snapshots/tmpfli2chtz/2c4bcdd2148c48b182cb148d62fe608b" returned: 0 in 0.374s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:04:49 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-3cb2fc70-596e-415f-ac12-97d5c168e6dc tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Snapshot extracted, beginning image upload Apr 20 16:04:49 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:49 user nova-compute[71605]: DEBUG nova.compute.manager [req-9b78b4c3-4356-410c-bfe4-67399656cb0e req-7a5fcc3f-a425-47dc-9a1c-f0add6a8fb79 service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] Received event network-vif-plugged-4bce4922-407c-4e11-b089-154a3299ea1c {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:04:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-9b78b4c3-4356-410c-bfe4-67399656cb0e req-7a5fcc3f-a425-47dc-9a1c-f0add6a8fb79 service nova] Acquiring lock "a5e68386-3b32-458b-9808-797d041c2235-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-9b78b4c3-4356-410c-bfe4-67399656cb0e req-7a5fcc3f-a425-47dc-9a1c-f0add6a8fb79 service nova] Lock "a5e68386-3b32-458b-9808-797d041c2235-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-9b78b4c3-4356-410c-bfe4-67399656cb0e req-7a5fcc3f-a425-47dc-9a1c-f0add6a8fb79 service nova] Lock "a5e68386-3b32-458b-9808-797d041c2235-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s 
{{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:49 user nova-compute[71605]: DEBUG nova.compute.manager [req-9b78b4c3-4356-410c-bfe4-67399656cb0e req-7a5fcc3f-a425-47dc-9a1c-f0add6a8fb79 service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] No waiting events found dispatching network-vif-plugged-4bce4922-407c-4e11-b089-154a3299ea1c {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:04:49 user nova-compute[71605]: WARNING nova.compute.manager [req-9b78b4c3-4356-410c-bfe4-67399656cb0e req-7a5fcc3f-a425-47dc-9a1c-f0add6a8fb79 service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] Received unexpected event network-vif-plugged-4bce4922-407c-4e11-b089-154a3299ea1c for instance with vm_state active and task_state deleting. Apr 20 16:04:49 user nova-compute[71605]: DEBUG nova.compute.manager [req-9b78b4c3-4356-410c-bfe4-67399656cb0e req-7a5fcc3f-a425-47dc-9a1c-f0add6a8fb79 service nova] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Received event network-changed-5287c61f-56b9-4a9f-87e7-ab7057df84be {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:04:49 user nova-compute[71605]: DEBUG nova.compute.manager [req-9b78b4c3-4356-410c-bfe4-67399656cb0e req-7a5fcc3f-a425-47dc-9a1c-f0add6a8fb79 service nova] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Refreshing instance network info cache due to event network-changed-5287c61f-56b9-4a9f-87e7-ab7057df84be. {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:04:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-9b78b4c3-4356-410c-bfe4-67399656cb0e req-7a5fcc3f-a425-47dc-9a1c-f0add6a8fb79 service nova] Acquiring lock "refresh_cache-5bda996a-1bfe-4f43-aa02-36a864153588" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:04:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-9b78b4c3-4356-410c-bfe4-67399656cb0e req-7a5fcc3f-a425-47dc-9a1c-f0add6a8fb79 service nova] Acquired lock "refresh_cache-5bda996a-1bfe-4f43-aa02-36a864153588" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:04:49 user nova-compute[71605]: DEBUG nova.network.neutron [req-9b78b4c3-4356-410c-bfe4-67399656cb0e req-7a5fcc3f-a425-47dc-9a1c-f0add6a8fb79 service nova] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Refreshing network info cache for port 5287c61f-56b9-4a9f-87e7-ab7057df84be {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:04:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Acquiring lock "5bda996a-1bfe-4f43-aa02-36a864153588" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "5bda996a-1bfe-4f43-aa02-36a864153588" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71605) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Acquiring lock "5bda996a-1bfe-4f43-aa02-36a864153588-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "5bda996a-1bfe-4f43-aa02-36a864153588-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "5bda996a-1bfe-4f43-aa02-36a864153588-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:49 user nova-compute[71605]: INFO nova.compute.manager [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Terminating instance Apr 20 16:04:49 user nova-compute[71605]: DEBUG nova.compute.manager [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Start destroying the instance on the hypervisor. {{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:04:49 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:49 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:49 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: a5e68386-3b32-458b-9808-797d041c2235] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:04:50 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: a5e68386-3b32-458b-9808-797d041c2235] Took 2.39 seconds to deallocate network for instance. 
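The paired "Acquiring lock" / "Acquired lock" / "Releasing lock" lines and the 'Lock "..." acquired by "..." :: waited/held' lines in these records come from oslo.concurrency's locking helpers, which serialize work on a given instance or event list. A minimal sketch, assuming only oslo.concurrency is available, of the two usage patterns that produce such log lines:

from oslo_concurrency import lockutils

# Context-manager form: logs "Acquiring lock ..." / "Acquired lock ..." /
# "Releasing lock ..." DEBUG lines, like the refresh_cache-<uuid> locks above.
def refresh_cache(instance_uuid):
    with lockutils.lock('refresh_cache-%s' % instance_uuid):
        pass  # rebuild the instance network info cache here

# Decorator form: logs 'Lock "..." acquired by "..." :: waited Ns' and
# '"released" ... :: held Ns' lines, like the <uuid>-events locks above.
@lockutils.synchronized('compute_resources')
def update_usage():
    pass  # adjust the resource tracker's view of the host here

if __name__ == '__main__':
    refresh_cache('5bda996a-1bfe-4f43-aa02-36a864153588')
    update_usage()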
Apr 20 16:04:50 user nova-compute[71605]: DEBUG nova.compute.manager [req-74c489b6-ec18-4cdd-a608-72e8dcd03ed6 req-a566fad6-3aa1-48e8-8598-fa78ba2eb15e service nova] [instance: a5e68386-3b32-458b-9808-797d041c2235] Received event network-vif-deleted-4bce4922-407c-4e11-b089-154a3299ea1c {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.333s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:50 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Deleted allocations for instance a5e68386-3b32-458b-9808-797d041c2235 Apr 20 16:04:50 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Instance destroyed successfully. 
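The inventory dict reported above for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 gives, per resource class, a raw total, a reserved amount, and an allocation ratio; the capacity Placement will schedule against is (total - reserved) * allocation_ratio. A small worked sketch over the exact values logged above:

# Inventory exactly as reported above for provider 00e9f769-1a1c-4f1e-80e4-b19657803102.
inventory = {
    'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    # Schedulable capacity per resource class.
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(f"{rc}: {capacity:g} schedulable")
# Prints: VCPU: 48, MEMORY_MB: 15511, DISK_GB: 40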
Apr 20 16:04:50 user nova-compute[71605]: DEBUG nova.objects.instance [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lazy-loading 'resources' on Instance uuid 5bda996a-1bfe-4f43-aa02-36a864153588 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:02:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-577930116',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-577930116',id=5,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMkHFFsWtozUTkF0VpQ+Cd6z15wOd291X4e8/v6QbZKdTx6+gptvNMQSpe0ybBenimgtpgGav2HnMz19ylSDLLeiOEgxywkrcPA8Jq0CjCrxBO54bQ0ViTd2ITYv71kQ9Q==',key_name='tempest-keypair-1173247378',keypairs=,launch_index=0,launched_at=2023-04-20T16:03:13Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='cb0a5eb3796a4d3a871843f409c6ffbd',ramdisk_id='',reservation_id='r-mig0m4d8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-1118127371',owner_user_name='tempest-AttachVolumeShelveTestJSON-1118127371-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:03:13Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='f50dbce30f294bb0ba6bc2811025835d',uuid=5bda996a-1bfe-4f43-aa02-36a864153588,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "address": "fa:16:3e:0c:45:0b", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5287c61f-56", "ovs_interfaceid": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Converting VIF {"id": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "address": "fa:16:3e:0c:45:0b", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5287c61f-56", "ovs_interfaceid": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:45:0b,bridge_name='br-int',has_traffic_filtering=True,id=5287c61f-56b9-4a9f-87e7-ab7057df84be,network=Network(545a57d8-9d55-4ace-a0ad-635d7bc0ae52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5287c61f-56') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG os_vif [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:45:0b,bridge_name='br-int',has_traffic_filtering=True,id=5287c61f-56b9-4a9f-87e7-ab7057df84be,network=Network(545a57d8-9d55-4ace-a0ad-635d7bc0ae52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5287c61f-56') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): 
DelPortCommand(_result=None, port=tap5287c61f-56, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:50 user nova-compute[71605]: INFO os_vif [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:45:0b,bridge_name='br-int',has_traffic_filtering=True,id=5287c61f-56b9-4a9f-87e7-ab7057df84be,network=Network(545a57d8-9d55-4ace-a0ad-635d7bc0ae52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5287c61f-56') Apr 20 16:04:50 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Deleting instance files /opt/stack/data/nova/instances/5bda996a-1bfe-4f43-aa02-36a864153588_del Apr 20 16:04:50 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Deletion of /opt/stack/data/nova/instances/5bda996a-1bfe-4f43-aa02-36a864153588_del complete Apr 20 16:04:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bed97145-0d0b-4c2f-98f8-f1496db2ba1b tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "a5e68386-3b32-458b-9808-797d041c2235" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 3.796s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG nova.network.neutron [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Successfully updated port: 14dcc4ff-4a09-446a-b0ea-d9989cd3fa16 {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Acquiring lock "refresh_cache-dd78d74a-11d6-4f06-8092-5088b3fad412" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Acquired lock "refresh_cache-dd78d74a-11d6-4f06-8092-5088b3fad412" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG nova.network.neutron 
[None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:04:50 user nova-compute[71605]: INFO nova.compute.manager [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Took 0.86 seconds to destroy the instance on the hypervisor. Apr 20 16:04:50 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG nova.network.neutron [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Instance cache missing network info. {{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG nova.network.neutron [req-9b78b4c3-4356-410c-bfe4-67399656cb0e req-7a5fcc3f-a425-47dc-9a1c-f0add6a8fb79 service nova] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Updated VIF entry in instance network info cache for port 5287c61f-56b9-4a9f-87e7-ab7057df84be. 
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG nova.network.neutron [req-9b78b4c3-4356-410c-bfe4-67399656cb0e req-7a5fcc3f-a425-47dc-9a1c-f0add6a8fb79 service nova] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Updating instance_info_cache with network_info: [{"id": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "address": "fa:16:3e:0c:45:0b", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.59", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5287c61f-56", "ovs_interfaceid": "5287c61f-56b9-4a9f-87e7-ab7057df84be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:04:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-9b78b4c3-4356-410c-bfe4-67399656cb0e req-7a5fcc3f-a425-47dc-9a1c-f0add6a8fb79 service nova] Releasing lock "refresh_cache-5bda996a-1bfe-4f43-aa02-36a864153588" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.network.neutron [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Updating instance_info_cache with network_info: [{"id": "14dcc4ff-4a09-446a-b0ea-d9989cd3fa16", "address": "fa:16:3e:7e:10:bd", "network": {"id": "c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap14dcc4ff-4a", "ovs_interfaceid": "14dcc4ff-4a09-446a-b0ea-d9989cd3fa16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Releasing lock "refresh_cache-dd78d74a-11d6-4f06-8092-5088b3fad412" 
{{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.compute.manager [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Instance network_info: |[{"id": "14dcc4ff-4a09-446a-b0ea-d9989cd3fa16", "address": "fa:16:3e:7e:10:bd", "network": {"id": "c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap14dcc4ff-4a", "ovs_interfaceid": "14dcc4ff-4a09-446a-b0ea-d9989cd3fa16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Start _get_guest_xml network_info=[{"id": "14dcc4ff-4a09-446a-b0ea-d9989cd3fa16", "address": "fa:16:3e:7e:10:bd", "network": {"id": "c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap14dcc4ff-4a", "ovs_interfaceid": "14dcc4ff-4a09-446a-b0ea-d9989cd3fa16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 
'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': '4ac69ea5-e5d7-40c8-864e-0a164d78a727'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:04:51 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:04:51 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.virt.hardware [None 
req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:04:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1136804593',display_name='tempest-ServersNegativeTestJSON-server-1136804593',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1136804593',id=11,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d8444d3c8f554a56967917670b19dc37',ramdisk_id='',reservation_id='r-osnh458p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-942369263',owner_user_name='tempest-ServersNegativeTestJSON-942369263-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:04:46Z,user_data=None,user_id='9be25e958c6047068ab5ce63106b0754',uuid=dd78d74a-11d6-4f06-8092-5088b3fad412,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "14dcc4ff-4a09-446a-b0ea-d9989cd3fa16", "address": "fa:16:3e:7e:10:bd", "network": {"id": "c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap14dcc4ff-4a", "ovs_interfaceid": "14dcc4ff-4a09-446a-b0ea-d9989cd3fa16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Converting VIF {"id": "14dcc4ff-4a09-446a-b0ea-d9989cd3fa16", "address": "fa:16:3e:7e:10:bd", "network": {"id": 
"c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap14dcc4ff-4a", "ovs_interfaceid": "14dcc4ff-4a09-446a-b0ea-d9989cd3fa16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7e:10:bd,bridge_name='br-int',has_traffic_filtering=True,id=14dcc4ff-4a09-446a-b0ea-d9989cd3fa16,network=Network(c36830a6-66f7-4f28-8879-e228da46cead),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14dcc4ff-4a') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.objects.instance [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lazy-loading 'pci_devices' on Instance uuid dd78d74a-11d6-4f06-8092-5088b3fad412 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] End _get_guest_xml xml= Apr 20 16:04:51 user nova-compute[71605]: dd78d74a-11d6-4f06-8092-5088b3fad412 Apr 20 16:04:51 user nova-compute[71605]: instance-0000000b Apr 20 16:04:51 user nova-compute[71605]: 131072 Apr 20 16:04:51 user nova-compute[71605]: 1 Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: tempest-ServersNegativeTestJSON-server-1136804593 Apr 20 16:04:51 user nova-compute[71605]: 2023-04-20 16:04:51 Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: 128 Apr 20 16:04:51 user nova-compute[71605]: 1 Apr 20 16:04:51 user nova-compute[71605]: 0 Apr 20 16:04:51 user nova-compute[71605]: 0 Apr 20 16:04:51 user nova-compute[71605]: 1 Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: tempest-ServersNegativeTestJSON-942369263-project-member Apr 20 16:04:51 user nova-compute[71605]: tempest-ServersNegativeTestJSON-942369263 Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user 
nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: OpenStack Foundation Apr 20 16:04:51 user nova-compute[71605]: OpenStack Nova Apr 20 16:04:51 user nova-compute[71605]: 0.0.0 Apr 20 16:04:51 user nova-compute[71605]: dd78d74a-11d6-4f06-8092-5088b3fad412 Apr 20 16:04:51 user nova-compute[71605]: dd78d74a-11d6-4f06-8092-5088b3fad412 Apr 20 16:04:51 user nova-compute[71605]: Virtual Machine Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: hvm Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Nehalem Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: /dev/urandom Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: Apr 20 16:04:51 user nova-compute[71605]: {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:04:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1136804593',display_name='tempest-ServersNegativeTestJSON-server-1136804593',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1136804593',id=11,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d8444d3c8f554a56967917670b19dc37',ramdisk_id='',reservation_id='r-osnh458p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-942369263',owner_user_name='tempest-ServersNegativeTestJSON-942369263-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:04:46Z,user_data=None,user_id='9be25e958c6047068ab5ce63106b0754',uuid=dd78d74a-11d6-4f06-8092-5088b3fad412,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "14dcc4ff-4a09-446a-b0ea-d9989cd3fa16", "address": "fa:16:3e:7e:10:bd", "network": {"id": "c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap14dcc4ff-4a", "ovs_interfaceid": "14dcc4ff-4a09-446a-b0ea-d9989cd3fa16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Converting VIF {"id": "14dcc4ff-4a09-446a-b0ea-d9989cd3fa16", "address": "fa:16:3e:7e:10:bd", "network": {"id": 
"c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap14dcc4ff-4a", "ovs_interfaceid": "14dcc4ff-4a09-446a-b0ea-d9989cd3fa16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7e:10:bd,bridge_name='br-int',has_traffic_filtering=True,id=14dcc4ff-4a09-446a-b0ea-d9989cd3fa16,network=Network(c36830a6-66f7-4f28-8879-e228da46cead),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14dcc4ff-4a') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG os_vif [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:10:bd,bridge_name='br-int',has_traffic_filtering=True,id=14dcc4ff-4a09-446a-b0ea-d9989cd3fa16,network=Network(c36830a6-66f7-4f28-8879-e228da46cead),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14dcc4ff-4a') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:04:51 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-3cb2fc70-596e-415f-ac12-97d5c168e6dc tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Snapshot image upload complete Apr 20 16:04:51 user nova-compute[71605]: INFO nova.compute.manager [None req-3cb2fc70-596e-415f-ac12-97d5c168e6dc tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Took 4.90 seconds to snapshot the instance on the hypervisor. 
Apr 20 16:04:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap14dcc4ff-4a, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap14dcc4ff-4a, col_values=(('external_ids', {'iface-id': '14dcc4ff-4a09-446a-b0ea-d9989cd3fa16', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7e:10:bd', 'vm-uuid': 'dd78d74a-11d6-4f06-8092-5088b3fad412'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:51 user nova-compute[71605]: INFO os_vif [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:10:bd,bridge_name='br-int',has_traffic_filtering=True,id=14dcc4ff-4a09-446a-b0ea-d9989cd3fa16,network=Network(c36830a6-66f7-4f28-8879-e228da46cead),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14dcc4ff-4a') Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] No VIF found with MAC fa:16:3e:7e:10:bd, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:04:51 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:04:51 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Took 1.31 seconds to deallocate network for instance. Apr 20 16:04:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG nova.compute.manager [req-8144e443-b9bf-41b4-a8e7-c2fb6c60b613 req-77b76ce3-c633-4bbf-84e1-536a55725df9 service nova] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Received event network-vif-unplugged-5287c61f-56b9-4a9f-87e7-ab7057df84be {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8144e443-b9bf-41b4-a8e7-c2fb6c60b613 req-77b76ce3-c633-4bbf-84e1-536a55725df9 service nova] Acquiring lock "5bda996a-1bfe-4f43-aa02-36a864153588-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8144e443-b9bf-41b4-a8e7-c2fb6c60b613 req-77b76ce3-c633-4bbf-84e1-536a55725df9 service nova] Lock "5bda996a-1bfe-4f43-aa02-36a864153588-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8144e443-b9bf-41b4-a8e7-c2fb6c60b613 req-77b76ce3-c633-4bbf-84e1-536a55725df9 service nova] Lock "5bda996a-1bfe-4f43-aa02-36a864153588-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG nova.compute.manager [req-8144e443-b9bf-41b4-a8e7-c2fb6c60b613 req-77b76ce3-c633-4bbf-84e1-536a55725df9 service nova] 
[instance: 5bda996a-1bfe-4f43-aa02-36a864153588] No waiting events found dispatching network-vif-unplugged-5287c61f-56b9-4a9f-87e7-ab7057df84be {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:04:52 user nova-compute[71605]: WARNING nova.compute.manager [req-8144e443-b9bf-41b4-a8e7-c2fb6c60b613 req-77b76ce3-c633-4bbf-84e1-536a55725df9 service nova] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Received unexpected event network-vif-unplugged-5287c61f-56b9-4a9f-87e7-ab7057df84be for instance with vm_state deleted and task_state None. Apr 20 16:04:52 user nova-compute[71605]: DEBUG nova.compute.manager [req-8144e443-b9bf-41b4-a8e7-c2fb6c60b613 req-77b76ce3-c633-4bbf-84e1-536a55725df9 service nova] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Received event network-changed-14dcc4ff-4a09-446a-b0ea-d9989cd3fa16 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG nova.compute.manager [req-8144e443-b9bf-41b4-a8e7-c2fb6c60b613 req-77b76ce3-c633-4bbf-84e1-536a55725df9 service nova] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Refreshing instance network info cache due to event network-changed-14dcc4ff-4a09-446a-b0ea-d9989cd3fa16. {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8144e443-b9bf-41b4-a8e7-c2fb6c60b613 req-77b76ce3-c633-4bbf-84e1-536a55725df9 service nova] Acquiring lock "refresh_cache-dd78d74a-11d6-4f06-8092-5088b3fad412" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8144e443-b9bf-41b4-a8e7-c2fb6c60b613 req-77b76ce3-c633-4bbf-84e1-536a55725df9 service nova] Acquired lock "refresh_cache-dd78d74a-11d6-4f06-8092-5088b3fad412" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG nova.network.neutron [req-8144e443-b9bf-41b4-a8e7-c2fb6c60b613 req-77b76ce3-c633-4bbf-84e1-536a55725df9 service nova] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Refreshing network info cache for port 14dcc4ff-4a09-446a-b0ea-d9989cd3fa16 {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.302s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:52 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Deleted allocations for instance 5bda996a-1bfe-4f43-aa02-36a864153588 Apr 20 16:04:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a863b487-23ba-40d2-9af2-26600527c08e tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "5bda996a-1bfe-4f43-aa02-36a864153588" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.658s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG nova.network.neutron [req-8144e443-b9bf-41b4-a8e7-c2fb6c60b613 req-77b76ce3-c633-4bbf-84e1-536a55725df9 service nova] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Updated VIF entry in instance network info cache for port 14dcc4ff-4a09-446a-b0ea-d9989cd3fa16. {{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG nova.network.neutron [req-8144e443-b9bf-41b4-a8e7-c2fb6c60b613 req-77b76ce3-c633-4bbf-84e1-536a55725df9 service nova] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Updating instance_info_cache with network_info: [{"id": "14dcc4ff-4a09-446a-b0ea-d9989cd3fa16", "address": "fa:16:3e:7e:10:bd", "network": {"id": "c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap14dcc4ff-4a", "ovs_interfaceid": "14dcc4ff-4a09-446a-b0ea-d9989cd3fa16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8144e443-b9bf-41b4-a8e7-c2fb6c60b613 req-77b76ce3-c633-4bbf-84e1-536a55725df9 service nova] Releasing lock "refresh_cache-dd78d74a-11d6-4f06-8092-5088b3fad412" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG nova.compute.manager [req-8144e443-b9bf-41b4-a8e7-c2fb6c60b613 req-77b76ce3-c633-4bbf-84e1-536a55725df9 service nova] [instance: 
5bda996a-1bfe-4f43-aa02-36a864153588] Received event network-vif-plugged-5287c61f-56b9-4a9f-87e7-ab7057df84be {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8144e443-b9bf-41b4-a8e7-c2fb6c60b613 req-77b76ce3-c633-4bbf-84e1-536a55725df9 service nova] Acquiring lock "5bda996a-1bfe-4f43-aa02-36a864153588-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8144e443-b9bf-41b4-a8e7-c2fb6c60b613 req-77b76ce3-c633-4bbf-84e1-536a55725df9 service nova] Lock "5bda996a-1bfe-4f43-aa02-36a864153588-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8144e443-b9bf-41b4-a8e7-c2fb6c60b613 req-77b76ce3-c633-4bbf-84e1-536a55725df9 service nova] Lock "5bda996a-1bfe-4f43-aa02-36a864153588-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG nova.compute.manager [req-8144e443-b9bf-41b4-a8e7-c2fb6c60b613 req-77b76ce3-c633-4bbf-84e1-536a55725df9 service nova] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] No waiting events found dispatching network-vif-plugged-5287c61f-56b9-4a9f-87e7-ab7057df84be {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:04:52 user nova-compute[71605]: WARNING nova.compute.manager [req-8144e443-b9bf-41b4-a8e7-c2fb6c60b613 req-77b76ce3-c633-4bbf-84e1-536a55725df9 service nova] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Received unexpected event network-vif-plugged-5287c61f-56b9-4a9f-87e7-ab7057df84be for instance with vm_state deleted and task_state None. 
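Note: the AddBridgeCommand / AddPortCommand / DbSetCommand transactions logged at 16:04:51 are the ovsdbapp calls os-vif issues while plugging the OVS port. A minimal sketch of that transaction pattern follows; the OVSDB socket path, timeout, and variable names are assumptions for illustration, not values taken from this log.

```python
# Sketch of the ovsdbapp transaction seen above (AddBridgeCommand, AddPortCommand,
# DbSetCommand). Connection details below are assumed, not taken from the log.
from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

OVSDB_CONNECTION = 'unix:/usr/local/var/run/openvswitch/db.sock'  # assumed path

idl = connection.OvsdbIdl.from_server(OVSDB_CONNECTION, 'Open_vSwitch')
ovsdb = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

external_ids = {
    'iface-id': '14dcc4ff-4a09-446a-b0ea-d9989cd3fa16',
    'iface-status': 'active',
    'attached-mac': 'fa:16:3e:7e:10:bd',
    'vm-uuid': 'dd78d74a-11d6-4f06-8092-5088b3fad412',
}

# A single transaction carrying the commands, mirroring "Running txn n=1" above.
with ovsdb.transaction(check_error=True) as txn:
    txn.add(ovsdb.add_br('br-int', may_exist=True, datapath_type='system'))
    txn.add(ovsdb.add_port('br-int', 'tap14dcc4ff-4a', may_exist=True))
    txn.add(ovsdb.db_set('Interface', 'tap14dcc4ff-4a',
                         ('external_ids', external_ids)))
```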
Apr 20 16:04:52 user nova-compute[71605]: DEBUG nova.compute.manager [req-8144e443-b9bf-41b4-a8e7-c2fb6c60b613 req-77b76ce3-c633-4bbf-84e1-536a55725df9 service nova] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Received event network-vif-deleted-5287c61f-56b9-4a9f-87e7-ab7057df84be {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:53 user nova-compute[71605]: DEBUG nova.compute.manager [req-197a310c-7bf6-433b-9619-a1314ab51358 req-d7a13878-4d1c-477c-8815-c0d7384ee3ec service nova] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Received event network-vif-plugged-14dcc4ff-4a09-446a-b0ea-d9989cd3fa16 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:04:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-197a310c-7bf6-433b-9619-a1314ab51358 req-d7a13878-4d1c-477c-8815-c0d7384ee3ec service nova] Acquiring lock "dd78d74a-11d6-4f06-8092-5088b3fad412-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-197a310c-7bf6-433b-9619-a1314ab51358 req-d7a13878-4d1c-477c-8815-c0d7384ee3ec service nova] Lock "dd78d74a-11d6-4f06-8092-5088b3fad412-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-197a310c-7bf6-433b-9619-a1314ab51358 req-d7a13878-4d1c-477c-8815-c0d7384ee3ec service nova] Lock "dd78d74a-11d6-4f06-8092-5088b3fad412-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:53 user nova-compute[71605]: DEBUG nova.compute.manager [req-197a310c-7bf6-433b-9619-a1314ab51358 req-d7a13878-4d1c-477c-8815-c0d7384ee3ec service nova] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] No waiting events found dispatching network-vif-plugged-14dcc4ff-4a09-446a-b0ea-d9989cd3fa16 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:04:53 user nova-compute[71605]: WARNING nova.compute.manager [req-197a310c-7bf6-433b-9619-a1314ab51358 req-d7a13878-4d1c-477c-8815-c0d7384ee3ec service nova] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Received unexpected event network-vif-plugged-14dcc4ff-4a09-446a-b0ea-d9989cd3fa16 for instance with vm_state building and task_state spawning. 
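Note: the inventory reported at 16:04:52 for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 (VCPU total 12 at allocation_ratio 4.0; MEMORY_MB total 16023 with 512 reserved; DISK_GB total 40) determines the capacity placement schedules against via the standard formula capacity = (total - reserved) * allocation_ratio. A quick check of those numbers:

```python
# Capacity placement derives from the inventory logged above:
# capacity = (total - reserved) * allocation_ratio
inventory = {
    'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(f'{rc}: {capacity:g}')  # VCPU: 48, MEMORY_MB: 15511, DISK_GB: 40
```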
Apr 20 16:04:53 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:53 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:04:55 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:04:55 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] VM Resumed (Lifecycle Event) Apr 20 16:04:55 user nova-compute[71605]: DEBUG nova.compute.manager [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:04:55 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:04:55 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Instance spawned successfully. 
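Note: the recurring "Acquiring lock ... / Lock ... acquired by ... :: waited/held Ns" DEBUG lines throughout this trace come from oslo.concurrency: the lockutils.py:312/315/333 frames are the lock() context manager, and the lockutils.py:404/409/423 "inner" frames are the synchronized() decorator. A minimal sketch of both forms, with hypothetical lock names and function:

```python
# Minimal sketch of the oslo.concurrency usage that emits the lock DEBUG lines
# above. The lock names and the function are illustrative, not Nova's code.
from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')   # logs "acquired by ... :: waited"
def update_usage():
    pass  # critical section; held time is logged when the lock is released

# Context-manager form; logs "Acquiring lock ... / Acquired lock ... / Releasing lock"
with lockutils.lock('refresh_cache-<instance-uuid>'):  # hypothetical lock name
    pass
```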
Apr 20 16:04:55 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:04:55 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:04:55 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:04:55 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:04:55 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Found default for hw_disk_bus of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:04:55 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:04:55 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:04:55 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:04:55 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: 
dd78d74a-11d6-4f06-8092-5088b3fad412] Found default for hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:04:55 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:04:55 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:04:55 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] VM Started (Lifecycle Event) Apr 20 16:04:55 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:04:55 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:04:55 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:04:55 user nova-compute[71605]: INFO nova.compute.manager [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Took 9.37 seconds to spawn the instance on the hypervisor. 
Apr 20 16:04:55 user nova-compute[71605]: DEBUG nova.compute.manager [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:04:55 user nova-compute[71605]: DEBUG nova.compute.manager [req-c75d0ba9-f81f-4afb-b87b-19bd57b2c176 req-3e9e615a-46ab-4273-8490-56905bbe2de1 service nova] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Received event network-vif-plugged-14dcc4ff-4a09-446a-b0ea-d9989cd3fa16 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:04:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-c75d0ba9-f81f-4afb-b87b-19bd57b2c176 req-3e9e615a-46ab-4273-8490-56905bbe2de1 service nova] Acquiring lock "dd78d74a-11d6-4f06-8092-5088b3fad412-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:04:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-c75d0ba9-f81f-4afb-b87b-19bd57b2c176 req-3e9e615a-46ab-4273-8490-56905bbe2de1 service nova] Lock "dd78d74a-11d6-4f06-8092-5088b3fad412-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:04:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-c75d0ba9-f81f-4afb-b87b-19bd57b2c176 req-3e9e615a-46ab-4273-8490-56905bbe2de1 service nova] Lock "dd78d74a-11d6-4f06-8092-5088b3fad412-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:55 user nova-compute[71605]: DEBUG nova.compute.manager [req-c75d0ba9-f81f-4afb-b87b-19bd57b2c176 req-3e9e615a-46ab-4273-8490-56905bbe2de1 service nova] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] No waiting events found dispatching network-vif-plugged-14dcc4ff-4a09-446a-b0ea-d9989cd3fa16 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:04:55 user nova-compute[71605]: WARNING nova.compute.manager [req-c75d0ba9-f81f-4afb-b87b-19bd57b2c176 req-3e9e615a-46ab-4273-8490-56905bbe2de1 service nova] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Received unexpected event network-vif-plugged-14dcc4ff-4a09-446a-b0ea-d9989cd3fa16 for instance with vm_state building and task_state spawning. Apr 20 16:04:55 user nova-compute[71605]: INFO nova.compute.manager [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Took 10.18 seconds to build instance. 
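Note: the repeated "No waiting events found dispatching network-vif-plugged-..." and "Received unexpected event ..." messages trace Nova's external-event handling: a task registers interest in an event, the Neutron notification pops the matching waiter under the per-instance "-events" lock, and a warning is logged when nothing is waiting (for example when the instance is already deleted, as with 5bda996a above). A simplified, self-contained sketch of that pattern, not Nova's actual implementation:

```python
# Simplified illustration of the waiter/dispatch pattern behind the
# "No waiting events found dispatching ..." messages above. Illustrative only.
import threading

class InstanceEvents:
    def __init__(self):
        self._lock = threading.Lock()
        self._events = {}  # {instance_uuid: {event_name: threading.Event}}

    def prepare(self, instance_uuid, event_name):
        """Register interest in an event before triggering the external action."""
        waiter = threading.Event()
        with self._lock:
            self._events.setdefault(instance_uuid, {})[event_name] = waiter
        return waiter

    def pop_event(self, instance_uuid, event_name):
        """Return the registered waiter for this event, or None if nothing waits."""
        with self._lock:
            return self._events.get(instance_uuid, {}).pop(event_name, None)

def external_instance_event(events, instance_uuid, event_name):
    waiter = events.pop_event(instance_uuid, event_name)
    if waiter is None:
        print(f'WARNING: received unexpected event {event_name} '
              f'for instance {instance_uuid}')
    else:
        waiter.set()  # wakes the task blocked waiting for this event
```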
Apr 20 16:04:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-410e2754-fabc-4f13-b7f7-b37ec50815cd tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "dd78d74a-11d6-4f06-8092-5088b3fad412" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.309s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:04:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:00 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:05:00 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] VM Stopped (Lifecycle Event) Apr 20 16:05:00 user nova-compute[71605]: DEBUG nova.compute.manager [None req-c2a1b7f7-507e-49cf-9959-d68edb5a9a8e None None] [instance: 6d55e5bd-9b03-40a9-bca9-88545039597c] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:05:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:05:02 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:05:02 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: a5e68386-3b32-458b-9808-797d041c2235] VM Stopped (Lifecycle Event) Apr 20 16:05:02 user nova-compute[71605]: DEBUG nova.compute.manager [None req-beae6d24-2395-431d-b8db-047b1ae8ef0c None None] [instance: a5e68386-3b32-458b-9808-797d041c2235] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:05:04 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:05 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:05:05 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] VM Stopped (Lifecycle Event) Apr 20 16:05:05 user nova-compute[71605]: DEBUG nova.compute.manager [None req-53b75945-c04e-4fb5-bb79-9bb7aa80e2a6 None None] [instance: 5bda996a-1bfe-4f43-aa02-36a864153588] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:05:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:09 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:14 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:15 
user nova-compute[71605]: DEBUG nova.compute.manager [req-14f6d381-1bbd-473a-aa6a-5e3cc04e9b44 req-56255606-d34a-44bf-98df-1f9ec2459259 service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Received event network-changed-5c711d7a-9f6d-49dd-af46-c3c1056f702e {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:15 user nova-compute[71605]: DEBUG nova.compute.manager [req-14f6d381-1bbd-473a-aa6a-5e3cc04e9b44 req-56255606-d34a-44bf-98df-1f9ec2459259 service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Refreshing instance network info cache due to event network-changed-5c711d7a-9f6d-49dd-af46-c3c1056f702e. {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:05:15 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-14f6d381-1bbd-473a-aa6a-5e3cc04e9b44 req-56255606-d34a-44bf-98df-1f9ec2459259 service nova] Acquiring lock "refresh_cache-65fc650d-2181-46cb-b91b-4a1104b2afab" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:05:15 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-14f6d381-1bbd-473a-aa6a-5e3cc04e9b44 req-56255606-d34a-44bf-98df-1f9ec2459259 service nova] Acquired lock "refresh_cache-65fc650d-2181-46cb-b91b-4a1104b2afab" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:05:15 user nova-compute[71605]: DEBUG nova.network.neutron [req-14f6d381-1bbd-473a-aa6a-5e3cc04e9b44 req-56255606-d34a-44bf-98df-1f9ec2459259 service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Refreshing network info cache for port 5c711d7a-9f6d-49dd-af46-c3c1056f702e {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:05:15 user nova-compute[71605]: DEBUG nova.network.neutron [req-14f6d381-1bbd-473a-aa6a-5e3cc04e9b44 req-56255606-d34a-44bf-98df-1f9ec2459259 service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Updated VIF entry in instance network info cache for port 5c711d7a-9f6d-49dd-af46-c3c1056f702e. 
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:05:15 user nova-compute[71605]: DEBUG nova.network.neutron [req-14f6d381-1bbd-473a-aa6a-5e3cc04e9b44 req-56255606-d34a-44bf-98df-1f9ec2459259 service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Updating instance_info_cache with network_info: [{"id": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "address": "fa:16:3e:70:93:1c", "network": {"id": "0bc5d911-da2e-4f4e-9427-2332d7a5bd08", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-494297859-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.30", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "132831801cee4fb185cc27c9792ff5ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c711d7a-9f", "ovs_interfaceid": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:05:15 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-14f6d381-1bbd-473a-aa6a-5e3cc04e9b44 req-56255606-d34a-44bf-98df-1f9ec2459259 service nova] Releasing lock "refresh_cache-65fc650d-2181-46cb-b91b-4a1104b2afab" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:05:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:16 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Acquiring lock "65fc650d-2181-46cb-b91b-4a1104b2afab" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:16 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Lock "65fc650d-2181-46cb-b91b-4a1104b2afab" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:16 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Acquiring lock "65fc650d-2181-46cb-b91b-4a1104b2afab-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:16 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b0877cda-221a-4856-922b-214df6b5b016 
tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Lock "65fc650d-2181-46cb-b91b-4a1104b2afab-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:16 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Lock "65fc650d-2181-46cb-b91b-4a1104b2afab-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:16 user nova-compute[71605]: INFO nova.compute.manager [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Terminating instance Apr 20 16:05:16 user nova-compute[71605]: DEBUG nova.compute.manager [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Start destroying the instance on the hypervisor. {{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:05:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:17 user nova-compute[71605]: DEBUG nova.compute.manager [req-51afaac4-c8e9-42da-917d-f8f58e6c9794 req-3e793430-1de1-47bf-a52e-9b4cdeed743c service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Received event network-vif-unplugged-5c711d7a-9f6d-49dd-af46-c3c1056f702e {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:17 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-51afaac4-c8e9-42da-917d-f8f58e6c9794 req-3e793430-1de1-47bf-a52e-9b4cdeed743c service nova] Acquiring lock "65fc650d-2181-46cb-b91b-4a1104b2afab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:17 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-51afaac4-c8e9-42da-917d-f8f58e6c9794 req-3e793430-1de1-47bf-a52e-9b4cdeed743c service nova] Lock "65fc650d-2181-46cb-b91b-4a1104b2afab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s 
{{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:17 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-51afaac4-c8e9-42da-917d-f8f58e6c9794 req-3e793430-1de1-47bf-a52e-9b4cdeed743c service nova] Lock "65fc650d-2181-46cb-b91b-4a1104b2afab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:17 user nova-compute[71605]: DEBUG nova.compute.manager [req-51afaac4-c8e9-42da-917d-f8f58e6c9794 req-3e793430-1de1-47bf-a52e-9b4cdeed743c service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] No waiting events found dispatching network-vif-unplugged-5c711d7a-9f6d-49dd-af46-c3c1056f702e {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:05:17 user nova-compute[71605]: DEBUG nova.compute.manager [req-51afaac4-c8e9-42da-917d-f8f58e6c9794 req-3e793430-1de1-47bf-a52e-9b4cdeed743c service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Received event network-vif-unplugged-5c711d7a-9f6d-49dd-af46-c3c1056f702e for instance with task_state deleting. {{(pid=71605) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 16:05:17 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Instance destroyed successfully. Apr 20 16:05:17 user nova-compute[71605]: DEBUG nova.objects.instance [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Lazy-loading 'resources' on Instance uuid 65fc650d-2181-46cb-b91b-4a1104b2afab {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:05:17 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-20T16:03:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-1701348631',display_name='tempest-AttachSCSIVolumeTestJSON-server-1701348631',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-1701348631',id=6,image_ref='4c26d9f3-9ee3-471e-b427-e25c3c09175c',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD2/JSQECxpyAoht39JYGnv52skhhDoF+ZMQJeVxL2a6UlTIckPD/ph8VMozU2wYXOiMIRgZDapWk23cxn+Rk7SbiF9E3tzwwP5mxsK4xqXHETPbeDxqHRE+MDclya79IQ==',key_name='tempest-keypair-1188279305',keypairs=,launch_index=0,launched_at=2023-04-20T16:03:34Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='132831801cee4fb185cc27c9792ff5ad',ramdisk_id='',reservation_id='r-7xqvsqeu',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4c26d9f3-9ee3-471e-b427-e25c3c09175c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_scsi_model='virtio-scsi',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachSCSIVolumeTestJSON-838012861',owner_user_name='tempest-AttachSCSIVolumeTestJSON-838012861-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:03:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='b21609ce02ce4ed2ba4f8f5d668da192',uuid=65fc650d-2181-46cb-b91b-4a1104b2afab,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "address": "fa:16:3e:70:93:1c", "network": {"id": "0bc5d911-da2e-4f4e-9427-2332d7a5bd08", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-494297859-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.30", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "132831801cee4fb185cc27c9792ff5ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c711d7a-9f", "ovs_interfaceid": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:05:17 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Converting VIF {"id": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "address": "fa:16:3e:70:93:1c", "network": {"id": "0bc5d911-da2e-4f4e-9427-2332d7a5bd08", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-494297859-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.30", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], 
"version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "132831801cee4fb185cc27c9792ff5ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5c711d7a-9f", "ovs_interfaceid": "5c711d7a-9f6d-49dd-af46-c3c1056f702e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:05:17 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:70:93:1c,bridge_name='br-int',has_traffic_filtering=True,id=5c711d7a-9f6d-49dd-af46-c3c1056f702e,network=Network(0bc5d911-da2e-4f4e-9427-2332d7a5bd08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c711d7a-9f') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:05:17 user nova-compute[71605]: DEBUG os_vif [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:70:93:1c,bridge_name='br-int',has_traffic_filtering=True,id=5c711d7a-9f6d-49dd-af46-c3c1056f702e,network=Network(0bc5d911-da2e-4f4e-9427-2332d7a5bd08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c711d7a-9f') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:05:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c711d7a-9f, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:05:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:17 user nova-compute[71605]: INFO os_vif [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:70:93:1c,bridge_name='br-int',has_traffic_filtering=True,id=5c711d7a-9f6d-49dd-af46-c3c1056f702e,network=Network(0bc5d911-da2e-4f4e-9427-2332d7a5bd08),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5c711d7a-9f') Apr 20 16:05:17 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Deleting instance files 
/opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab_del Apr 20 16:05:17 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Deletion of /opt/stack/data/nova/instances/65fc650d-2181-46cb-b91b-4a1104b2afab_del complete Apr 20 16:05:17 user nova-compute[71605]: INFO nova.compute.manager [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 20 16:05:17 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:05:17 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:05:17 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:05:18 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:05:18 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Took 0.87 seconds to deallocate network for instance. Apr 20 16:05:18 user nova-compute[71605]: DEBUG nova.compute.manager [req-fb65a52b-3bfa-4bd0-ba16-2255f5d7547d req-2b51c10e-82f7-47c3-87e6-358cb846d164 service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Received event network-vif-deleted-5c711d7a-9f6d-49dd-af46-c3c1056f702e {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:18 user nova-compute[71605]: INFO nova.compute.manager [req-fb65a52b-3bfa-4bd0-ba16-2255f5d7547d req-2b51c10e-82f7-47c3-87e6-358cb846d164 service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Neutron deleted interface 5c711d7a-9f6d-49dd-af46-c3c1056f702e; detaching it from the instance and deleting it from the info cache Apr 20 16:05:18 user nova-compute[71605]: DEBUG nova.network.neutron [req-fb65a52b-3bfa-4bd0-ba16-2255f5d7547d req-2b51c10e-82f7-47c3-87e6-358cb846d164 service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:05:18 user nova-compute[71605]: DEBUG nova.compute.manager [req-fb65a52b-3bfa-4bd0-ba16-2255f5d7547d req-2b51c10e-82f7-47c3-87e6-358cb846d164 service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Detach interface failed, port_id=5c711d7a-9f6d-49dd-af46-c3c1056f702e, reason: Instance 65fc650d-2181-46cb-b91b-4a1104b2afab could not be found. 
{{(pid=71605) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 20 16:05:18 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:18 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:18 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:05:18 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:05:18 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.285s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:18 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Deleted allocations for instance 65fc650d-2181-46cb-b91b-4a1104b2afab Apr 20 16:05:18 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-b0877cda-221a-4856-922b-214df6b5b016 tempest-AttachSCSIVolumeTestJSON-838012861 tempest-AttachSCSIVolumeTestJSON-838012861-project-member] Lock "65fc650d-2181-46cb-b91b-4a1104b2afab" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.009s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:19 user nova-compute[71605]: DEBUG nova.compute.manager [req-e03dd3d2-79d1-4223-980c-85a8d481e123 req-9664f284-5ecb-4d91-b8a4-3dadf54ce271 service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Received event 
network-vif-plugged-5c711d7a-9f6d-49dd-af46-c3c1056f702e {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:19 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-e03dd3d2-79d1-4223-980c-85a8d481e123 req-9664f284-5ecb-4d91-b8a4-3dadf54ce271 service nova] Acquiring lock "65fc650d-2181-46cb-b91b-4a1104b2afab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:19 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-e03dd3d2-79d1-4223-980c-85a8d481e123 req-9664f284-5ecb-4d91-b8a4-3dadf54ce271 service nova] Lock "65fc650d-2181-46cb-b91b-4a1104b2afab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:19 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-e03dd3d2-79d1-4223-980c-85a8d481e123 req-9664f284-5ecb-4d91-b8a4-3dadf54ce271 service nova] Lock "65fc650d-2181-46cb-b91b-4a1104b2afab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:19 user nova-compute[71605]: DEBUG nova.compute.manager [req-e03dd3d2-79d1-4223-980c-85a8d481e123 req-9664f284-5ecb-4d91-b8a4-3dadf54ce271 service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] No waiting events found dispatching network-vif-plugged-5c711d7a-9f6d-49dd-af46-c3c1056f702e {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:05:19 user nova-compute[71605]: WARNING nova.compute.manager [req-e03dd3d2-79d1-4223-980c-85a8d481e123 req-9664f284-5ecb-4d91-b8a4-3dadf54ce271 service nova] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Received unexpected event network-vif-plugged-5c711d7a-9f6d-49dd-af46-c3c1056f702e for instance with vm_state deleted and task_state None. 
Apr 20 16:05:19 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:24 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:29 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:32 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:05:32 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] VM Stopped (Lifecycle Event) Apr 20 16:05:32 user nova-compute[71605]: DEBUG nova.compute.manager [None req-dc340249-3220-4822-9c83-ae0971f61a8e None None] [instance: 65fc650d-2181-46cb-b91b-4a1104b2afab] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:05:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:35 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:37 user nova-compute[71605]: DEBUG nova.compute.manager [req-9792e975-6615-42a6-84c9-a34fd76c898a req-3fedcd34-3477-4656-adc5-a5b0912b0096 service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Received event network-changed-a0d0df58-0e84-4e27-bc44-3c5983d6d23b {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:37 user nova-compute[71605]: DEBUG nova.compute.manager [req-9792e975-6615-42a6-84c9-a34fd76c898a req-3fedcd34-3477-4656-adc5-a5b0912b0096 service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Refreshing instance network info cache due to event network-changed-a0d0df58-0e84-4e27-bc44-3c5983d6d23b. 
{{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:05:37 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-9792e975-6615-42a6-84c9-a34fd76c898a req-3fedcd34-3477-4656-adc5-a5b0912b0096 service nova] Acquiring lock "refresh_cache-e1036e0f-683f-4dfd-b0ad-6187d90ff2f6" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:05:37 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-9792e975-6615-42a6-84c9-a34fd76c898a req-3fedcd34-3477-4656-adc5-a5b0912b0096 service nova] Acquired lock "refresh_cache-e1036e0f-683f-4dfd-b0ad-6187d90ff2f6" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:05:37 user nova-compute[71605]: DEBUG nova.network.neutron [req-9792e975-6615-42a6-84c9-a34fd76c898a req-3fedcd34-3477-4656-adc5-a5b0912b0096 service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Refreshing network info cache for port a0d0df58-0e84-4e27-bc44-3c5983d6d23b {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:05:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:38 user nova-compute[71605]: DEBUG nova.network.neutron [req-9792e975-6615-42a6-84c9-a34fd76c898a req-3fedcd34-3477-4656-adc5-a5b0912b0096 service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Updated VIF entry in instance network info cache for port a0d0df58-0e84-4e27-bc44-3c5983d6d23b. 
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:05:38 user nova-compute[71605]: DEBUG nova.network.neutron [req-9792e975-6615-42a6-84c9-a34fd76c898a req-3fedcd34-3477-4656-adc5-a5b0912b0096 service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Updating instance_info_cache with network_info: [{"id": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "address": "fa:16:3e:96:25:ea", "network": {"id": "27275346-fa92-4114-a62b-d59f0212eb8f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-871140467-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.65", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "77f831070f5847bda788f6f0fcfedb03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0d0df58-0e", "ovs_interfaceid": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:05:38 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-9792e975-6615-42a6-84c9-a34fd76c898a req-3fedcd34-3477-4656-adc5-a5b0912b0096 service nova] Releasing lock "refresh_cache-e1036e0f-683f-4dfd-b0ad-6187d90ff2f6" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:05:38 user nova-compute[71605]: INFO nova.compute.manager [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Rescuing Apr 20 16:05:38 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquiring lock "refresh_cache-fe0bde76-a4f8-4865-91af-2bd3790587a7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:05:38 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquired lock "refresh_cache-fe0bde76-a4f8-4865-91af-2bd3790587a7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:05:38 user nova-compute[71605]: DEBUG nova.network.neutron [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG nova.network.neutron [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] 
[instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Updating instance_info_cache with network_info: [{"id": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "address": "fa:16:3e:dd:52:dd", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f4d2191-16", "ovs_interfaceid": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Releasing lock "refresh_cache-fe0bde76-a4f8-4865-91af-2bd3790587a7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Acquiring lock "e1036e0f-683f-4dfd-b0ad-6187d90ff2f6" by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "e1036e0f-683f-4dfd-b0ad-6187d90ff2f6" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Acquiring lock "e1036e0f-683f-4dfd-b0ad-6187d90ff2f6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "e1036e0f-683f-4dfd-b0ad-6187d90ff2f6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "e1036e0f-683f-4dfd-b0ad-6187d90ff2f6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:39 user nova-compute[71605]: INFO nova.compute.manager [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Terminating instance Apr 20 16:05:39 user nova-compute[71605]: DEBUG nova.compute.manager [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Start destroying the instance on the hypervisor. 
{{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG nova.compute.manager [req-eeb1ca6e-960f-4b23-a6e2-7e6321872946 req-a081f9a2-54e3-49e6-88a6-eb26635802d2 service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Received event network-vif-unplugged-a0d0df58-0e84-4e27-bc44-3c5983d6d23b {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-eeb1ca6e-960f-4b23-a6e2-7e6321872946 req-a081f9a2-54e3-49e6-88a6-eb26635802d2 service nova] Acquiring lock "e1036e0f-683f-4dfd-b0ad-6187d90ff2f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-eeb1ca6e-960f-4b23-a6e2-7e6321872946 req-a081f9a2-54e3-49e6-88a6-eb26635802d2 service nova] Lock "e1036e0f-683f-4dfd-b0ad-6187d90ff2f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-eeb1ca6e-960f-4b23-a6e2-7e6321872946 req-a081f9a2-54e3-49e6-88a6-eb26635802d2 service nova] Lock "e1036e0f-683f-4dfd-b0ad-6187d90ff2f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG nova.compute.manager [req-eeb1ca6e-960f-4b23-a6e2-7e6321872946 req-a081f9a2-54e3-49e6-88a6-eb26635802d2 service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] No waiting events found dispatching network-vif-unplugged-a0d0df58-0e84-4e27-bc44-3c5983d6d23b {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG nova.compute.manager [req-eeb1ca6e-960f-4b23-a6e2-7e6321872946 req-a081f9a2-54e3-49e6-88a6-eb26635802d2 service nova] 
[instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Received event network-vif-unplugged-a0d0df58-0e84-4e27-bc44-3c5983d6d23b for instance with task_state deleting. {{(pid=71605) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:40 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Instance destroyed successfully. Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.objects.instance [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lazy-loading 'resources' on Instance uuid e1036e0f-683f-4dfd-b0ad-6187d90ff2f6 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:05:40 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Instance destroyed successfully. Apr 20 16:05:40 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Attempting rescue Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} {{(pid=71605) rescue /opt/stack/nova/nova/virt/libvirt/driver.py:4289}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:03:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-943462612',display_name='tempest-AttachVolumeTestJSON-server-943462612',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-943462612',id=7,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLiDjegrCZQzggT2W88rJM8apM9Us8G90ElMugxXSgu6RWOdd7UNXIA5I2rSuifsaAIZ7hdjna3OuK6N+Oig2F4ghSuSm7pUTAMo6SzF09nfKRInfS2/IkPdA5ci5VCPuw==',key_name='tempest-keypair-2058378220',keypairs=,launch_index=0,launched_at=2023-04-20T16:03:55Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='77f831070f5847bda788f6f0fcfedb03',ramdisk_id='',reservation_id='r-genr7j8b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeTestJSON-1838780462',owner_user_name='tempest-AttachVolumeTestJSON-1838780462-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:03:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1c8f57b12bc749888ea89bdbee258811',uuid=e1036e0f-683f-4dfd-b0ad-6187d90ff2f6,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "address": "fa:16:3e:96:25:ea", "network": {"id": "27275346-fa92-4114-a62b-d59f0212eb8f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-871140467-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.65", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "77f831070f5847bda788f6f0fcfedb03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0d0df58-0e", "ovs_interfaceid": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Converting VIF {"id": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "address": "fa:16:3e:96:25:ea", "network": {"id": "27275346-fa92-4114-a62b-d59f0212eb8f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-871140467-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.65", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "77f831070f5847bda788f6f0fcfedb03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0d0df58-0e", "ovs_interfaceid": "a0d0df58-0e84-4e27-bc44-3c5983d6d23b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:96:25:ea,bridge_name='br-int',has_traffic_filtering=True,id=a0d0df58-0e84-4e27-bc44-3c5983d6d23b,network=Network(27275346-fa92-4114-a62b-d59f0212eb8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0d0df58-0e') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG os_vif [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:96:25:ea,bridge_name='br-int',has_traffic_filtering=True,id=a0d0df58-0e84-4e27-bc44-3c5983d6d23b,network=Network(27275346-fa92-4114-a62b-d59f0212eb8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0d0df58-0e') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa0d0df58-0e, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Instance directory exists: not creating {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4694}} Apr 20 16:05:40 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Creating image(s) Apr 20 16:05:40 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquiring lock "/opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "/opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "/opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.objects.instance [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lazy-loading 'trusted_certs' on Instance uuid fe0bde76-a4f8-4865-91af-2bd3790587a7 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:40 user nova-compute[71605]: INFO os_vif [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:96:25:ea,bridge_name='br-int',has_traffic_filtering=True,id=a0d0df58-0e84-4e27-bc44-3c5983d6d23b,network=Network(27275346-fa92-4114-a62b-d59f0212eb8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0d0df58-0e') Apr 20 16:05:40 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 
tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Deleting instance files /opt/stack/data/nova/instances/e1036e0f-683f-4dfd-b0ad-6187d90ff2f6_del Apr 20 16:05:40 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Deletion of /opt/stack/data/nova/instances/e1036e0f-683f-4dfd-b0ad-6187d90ff2f6_del complete Apr 20 16:05:40 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:40 user nova-compute[71605]: INFO nova.compute.manager [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Took 1.34 seconds to destroy the instance on the hypervisor. Apr 20 16:05:40 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
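Editor's note: the qemu-img calls above are issued through oslo.concurrency's processutils with a prlimit wrapper, which is what produces the "--as=1073741824 --cpu=30" arguments (1 GiB address space, 30 s CPU time). A hedged sketch of issuing the same kind of call follows; the disk path is a placeholder, not one of the instance paths above.

    # Sketch of running "qemu-img info" under the same prlimit bounds that
    # appear in the log. The disk path is a placeholder; point it at a real
    # image before running.
    from oslo_concurrency import processutils

    limits = processutils.ProcessLimits(address_space=1073741824, cpu_time=30)
    out, err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info', '/path/to/disk', '--force-share', '--output=json',
        prlimit=limits)
    print(out)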
{{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.139s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk.rescue {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk.rescue" returned: 0 in 0.048s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.193s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.objects.instance [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lazy-loading 'migration_context' on Instance uuid fe0bde76-a4f8-4865-91af-2bd3790587a7 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 
tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Start _get_guest_xml network_info=[{"id": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "address": "fa:16:3e:dd:52:dd", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "vif_mac": "fa:16:3e:dd:52:dd"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f4d2191-16", "ovs_interfaceid": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=) rescue={'image_id': '4ac69ea5-e5d7-40c8-864e-0a164d78a727', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.objects.instance [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lazy-loading 'resources' on Instance uuid fe0bde76-a4f8-4865-91af-2bd3790587a7 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.objects.instance [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lazy-loading 'numa_topology' on Instance uuid fe0bde76-a4f8-4865-91af-2bd3790587a7 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 
--cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json" returned: 0 in 0.152s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:40 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:05:40 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.objects.instance [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lazy-loading 'vcpu_model' on Instance uuid fe0bde76-a4f8-4865-91af-2bd3790587a7 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 
tempest-ServerRescueNegativeTestJSON-237285916-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:03:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1422609663',display_name='tempest-ServerRescueNegativeTestJSON-server-1422609663',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1422609663',id=9,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-20T16:03:59Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='1d4a73ba128147f295bf6a4545fede47',ramdisk_id='',reservation_id='r-0526y6jk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-237285916',owner_user_name='tempest-ServerRescueNegativeTestJSON-237285916-project-member'},tags=,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:03:59Z,user_data=None,user_id='e51e637e06d1475692c4055ae99121da',uuid=fe0bde76-a4f8-4865-91af-2bd3790587a7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "address": "fa:16:3e:dd:52:dd", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "vif_mac": "fa:16:3e:dd:52:dd"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f4d2191-16", "ovs_interfaceid": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:05:40 user 
nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Converting VIF {"id": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "address": "fa:16:3e:dd:52:dd", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "vif_mac": "fa:16:3e:dd:52:dd"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f4d2191-16", "ovs_interfaceid": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:52:dd,bridge_name='br-int',has_traffic_filtering=True,id=9f4d2191-16c0-4ab6-a4bd-f016499a9aad,network=Network(4b5db782-8dbb-4f06-8e98-a794013dbc8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f4d2191-16') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.objects.instance [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lazy-loading 'pci_devices' on Instance uuid fe0bde76-a4f8-4865-91af-2bd3790587a7 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] End _get_guest_xml xml= Apr 20 16:05:40 user nova-compute[71605]: fe0bde76-a4f8-4865-91af-2bd3790587a7 Apr 20 16:05:40 user nova-compute[71605]: instance-00000009 Apr 20 16:05:40 user nova-compute[71605]: 131072 Apr 20 16:05:40 user nova-compute[71605]: 1 Apr 20 16:05:40 user nova-compute[71605]: Apr 20 16:05:40 user nova-compute[71605]: Apr 20 16:05:40 user nova-compute[71605]: Apr 20 16:05:40 user nova-compute[71605]: tempest-ServerRescueNegativeTestJSON-server-1422609663 Apr 20 16:05:40 user nova-compute[71605]: 2023-04-20 16:05:40 Apr 20 16:05:40 user nova-compute[71605]: Apr 20 16:05:40 user nova-compute[71605]: 128 Apr 20 16:05:40 user nova-compute[71605]: 1 Apr 20 16:05:40 user nova-compute[71605]: 0 Apr 20 16:05:40 user nova-compute[71605]: 0 Apr 20 16:05:40 user nova-compute[71605]: 1 Apr 20 16:05:40 user nova-compute[71605]: Apr 20 
16:05:40 user nova-compute[71605]: [remainder of the generated libvirt guest XML elided: the markup was stripped from this capture, leaving only bare element values; recoverable details include project/user tempest-ServerRescueNegativeTestJSON-237285916(-project-member), sysinfo "OpenStack Foundation" / "OpenStack Nova" 0.0.0, uuid fe0bde76-a4f8-4865-91af-2bd3790587a7, machine type hvm, CPU model Nehalem, and rng backend /dev/urandom] {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit
--as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:40 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:40 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Instance destroyed successfully. Apr 20 16:05:41 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] No BDM found with device name vda, not building metadata. {{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] No BDM found with device name vdb, not building metadata. {{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] No VIF found with MAC fa:16:3e:dd:52:dd, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk --force-share --output=json" returned: 0 in 0.167s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk --force-share --output=json" returned: 
0 in 0.153s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json" returned: 0 in 0.184s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG nova.compute.manager [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Received event network-vif-plugged-a0d0df58-0e84-4e27-bc44-3c5983d6d23b {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] Acquiring lock "e1036e0f-683f-4dfd-b0ad-6187d90ff2f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] Lock "e1036e0f-683f-4dfd-b0ad-6187d90ff2f6-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] Lock "e1036e0f-683f-4dfd-b0ad-6187d90ff2f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG nova.compute.manager [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] No waiting events found dispatching network-vif-plugged-a0d0df58-0e84-4e27-bc44-3c5983d6d23b {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:05:41 user nova-compute[71605]: WARNING nova.compute.manager [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Received unexpected event network-vif-plugged-a0d0df58-0e84-4e27-bc44-3c5983d6d23b for instance with vm_state active and task_state deleting. Apr 20 16:05:41 user nova-compute[71605]: DEBUG nova.compute.manager [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received event network-vif-unplugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] Acquiring lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG nova.compute.manager [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] No waiting events found dispatching network-vif-unplugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:05:41 user nova-compute[71605]: WARNING nova.compute.manager [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] [instance: 
fe0bde76-a4f8-4865-91af-2bd3790587a7] Received unexpected event network-vif-unplugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad for instance with vm_state active and task_state rescuing. Apr 20 16:05:41 user nova-compute[71605]: DEBUG nova.compute.manager [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received event network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] Acquiring lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG nova.compute.manager [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] No waiting events found dispatching network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:05:41 user nova-compute[71605]: WARNING nova.compute.manager [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received unexpected event network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad for instance with vm_state active and task_state rescuing. 
Apr 20 16:05:41 user nova-compute[71605]: DEBUG nova.compute.manager [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received event network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] Acquiring lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG nova.compute.manager [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] No waiting events found dispatching network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:05:41 user nova-compute[71605]: WARNING nova.compute.manager [req-d7a27313-9144-4878-a973-a80719d2b801 req-48569e0d-9f28-4712-a557-6cada4bcbfa9 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received unexpected event network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad for instance with vm_state active and task_state rescuing. 
Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "c2b84ca2-f67b-4219-b7e6-18d2029e998a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "c2b84ca2-f67b-4219-b7e6-18d2029e998a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk --force-share --output=json" returned: 0 in 0.163s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd78d74a-11d6-4f06-8092-5088b3fad412/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:41 user nova-compute[71605]: DEBUG nova.compute.manager [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Starting instance... 
{{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:05:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd78d74a-11d6-4f06-8092-5088b3fad412/disk --force-share --output=json" returned: 0 in 0.155s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd78d74a-11d6-4f06-8092-5088b3fad412/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:42 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:05:42 user nova-compute[71605]: INFO nova.compute.claims [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Claim successful on node user Apr 20 16:05:42 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:05:42 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Took 1.56 seconds to deallocate network for instance. 
Apr 20 16:05:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dd78d74a-11d6-4f06-8092-5088b3fad412/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk --force-share --output=json" returned: 0 in 0.215s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:42 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:05:42 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None 
req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:05:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81/disk --force-share --output=json" returned: 0 in 0.170s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:42 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Error from libvirt while getting description of instance-00000007: [Error Code 42] Domain not found: no domain with matching uuid 'e1036e0f-683f-4dfd-b0ad-6187d90ff2f6' (instance-00000007): libvirt.libvirtError: Domain not found: no domain with matching uuid 'e1036e0f-683f-4dfd-b0ad-6187d90ff2f6' (instance-00000007) Apr 20 16:05:43 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.052s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:43 user nova-compute[71605]: DEBUG nova.compute.manager [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Start building networks asynchronously for instance. 
{{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:05:43 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.816s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG nova.compute.manager [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received event network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] Acquiring lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG nova.compute.manager [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] No waiting events found dispatching network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:05:44 user nova-compute[71605]: WARNING nova.compute.manager [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received unexpected event network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad for instance with vm_state active and task_state rescuing. 
Apr 20 16:05:44 user nova-compute[71605]: DEBUG nova.compute.manager [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received event network-vif-unplugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] Acquiring lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG nova.compute.manager [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] No waiting events found dispatching network-vif-unplugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:05:44 user nova-compute[71605]: WARNING nova.compute.manager [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received unexpected event network-vif-unplugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad for instance with vm_state active and task_state rescuing. 
Apr 20 16:05:44 user nova-compute[71605]: DEBUG nova.compute.manager [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Received event network-vif-deleted-a0d0df58-0e84-4e27-bc44-3c5983d6d23b {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG nova.compute.manager [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received event network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] Acquiring lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG nova.compute.manager [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] No waiting events found dispatching network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:05:44 user nova-compute[71605]: WARNING nova.compute.manager [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received unexpected event network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad for instance with vm_state active and task_state rescuing. 
Apr 20 16:05:44 user nova-compute[71605]: DEBUG nova.compute.manager [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received event network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] Acquiring lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG nova.compute.manager [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] No waiting events found dispatching network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:05:44 user nova-compute[71605]: WARNING nova.compute.manager [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received unexpected event network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad for instance with vm_state active and task_state rescuing. 
Apr 20 16:05:44 user nova-compute[71605]: DEBUG nova.compute.manager [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received event network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] Acquiring lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG nova.compute.manager [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] No waiting events found dispatching network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:05:44 user nova-compute[71605]: WARNING nova.compute.manager [req-27e68161-02a6-467e-bb68-1db3659f3ef9 req-f28e2992-0f70-44c9-8f66-1514d4ce0bcf service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received unexpected event network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad for instance with vm_state active and task_state rescuing. Apr 20 16:05:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.host [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Removed pending event for fe0bde76-a4f8-4865-91af-2bd3790587a7 due to event {{(pid=71605) _event_emit_delayed /opt/stack/nova/nova/virt/libvirt/host.py:438}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:05:44 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] VM Resumed (Lifecycle Event) Apr 20 16:05:44 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 20 16:05:44 user nova-compute[71605]: DEBUG nova.compute.manager [None req-5e7cc138-b1c6-4100-ad42-f03cb4fbbcc2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG nova.compute.manager [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Allocating IP information in the background. {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG nova.network.neutron [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Acquiring lock "a760987f-1a65-4e42-8cef-73db9ef2db48" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "a760987f-1a65-4e42-8cef-73db9ef2db48" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG nova.compute.manager [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:05:44 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:05:44 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 16:05:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=8385MB free_disk=26.26178741455078GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:44 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:05:45 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] During sync_power_state the instance has a pending task (rescuing). Skip. Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:05:45 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] VM Started (Lifecycle Event) Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Starting instance... 
{{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Start spawning the instance on the hypervisor. {{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:05:45 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Creating image(s) Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "/opt/stack/data/nova/instances/c2b84ca2-f67b-4219-b7e6-18d2029e998a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "/opt/stack/data/nova/instances/c2b84ca2-f67b-4219-b7e6-18d2029e998a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "/opt/stack/data/nova/instances/c2b84ca2-f67b-4219-b7e6-18d2029e998a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.012s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 
tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.policy [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '690c49feae904687826fb959ba5ba283', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71cf2664111f45788d24092e8ceede9c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.148s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.138s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/c2b84ca2-f67b-4219-b7e6-18d2029e998a/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.339s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.504s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] CMD "env 
LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/c2b84ca2-f67b-4219-b7e6-18d2029e998a/disk 1073741824" returned: 0 in 0.058s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.201s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:45 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Deleted allocations for instance e1036e0f-683f-4dfd-b0ad-6187d90ff2f6 Apr 20 16:05:45 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.189s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Checking if we can resize image /opt/stack/data/nova/instances/c2b84ca2-f67b-4219-b7e6-18d2029e998a/disk. 
size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c2b84ca2-f67b-4219-b7e6-18d2029e998a/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-03d3aa40-47e4-46c3-93d1-c8b07ad5d339 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "e1036e0f-683f-4dfd-b0ad-6187d90ff2f6" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 6.433s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance d4ea4d29-b178-4da2-b971-76f97031b244 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance e8f62d46-e2dc-4870-adf1-f62d88bb653b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance fe0bde76-a4f8-4865-91af-2bd3790587a7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance dc918ed4-8bc6-4a4f-a189-d6cdd5817854 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance dd78d74a-11d6-4f06-8092-5088b3fad412 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance c2b84ca2-f67b-4219-b7e6-18d2029e998a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance a760987f-1a65-4e42-8cef-73db9ef2db48 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1689}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 7 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=1408MB phys_disk=40GB used_disk=7GB total_vcpus=12 used_vcpus=7 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c2b84ca2-f67b-4219-b7e6-18d2029e998a/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Cannot resize image /opt/stack/data/nova/instances/c2b84ca2-f67b-4219-b7e6-18d2029e998a/disk to a smaller size. 
{{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.objects.instance [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lazy-loading 'migration_context' on Instance uuid c2b84ca2-f67b-4219-b7e6-18d2029e998a {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Ensure instance console log exists: /opt/stack/data/nova/instances/c2b84ca2-f67b-4219-b7e6-18d2029e998a/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.577s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.881s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:05:46 user nova-compute[71605]: INFO nova.compute.claims [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Claim successful on node user Apr 20 16:05:46 user nova-compute[71605]: DEBUG nova.compute.manager [req-483eca30-c1f7-437c-a143-bfd493ec573e req-4750b284-37f9-4a6b-8285-914f2fce4685 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Received event network-changed-74703b46-6b03-4752-953b-9c64a63249c8 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG nova.compute.manager [req-483eca30-c1f7-437c-a143-bfd493ec573e req-4750b284-37f9-4a6b-8285-914f2fce4685 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Refreshing instance network info cache due to event network-changed-74703b46-6b03-4752-953b-9c64a63249c8. 
{{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-483eca30-c1f7-437c-a143-bfd493ec573e req-4750b284-37f9-4a6b-8285-914f2fce4685 service nova] Acquiring lock "refresh_cache-dc918ed4-8bc6-4a4f-a189-d6cdd5817854" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-483eca30-c1f7-437c-a143-bfd493ec573e req-4750b284-37f9-4a6b-8285-914f2fce4685 service nova] Acquired lock "refresh_cache-dc918ed4-8bc6-4a4f-a189-d6cdd5817854" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG nova.network.neutron [req-483eca30-c1f7-437c-a143-bfd493ec573e req-4750b284-37f9-4a6b-8285-914f2fce4685 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Refreshing network info cache for port 74703b46-6b03-4752-953b-9c64a63249c8 {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG nova.network.neutron [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Successfully created port: 9e814c79-86f6-46ce-9473-d87fb7e67641 {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.491s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG nova.compute.manager [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Start building networks asynchronously for instance. 
{{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG nova.compute.manager [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Allocating IP information in the background. {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG nova.network.neutron [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:05:46 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 16:05:46 user nova-compute[71605]: DEBUG nova.compute.manager [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG nova.compute.manager [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Start spawning the instance on the hypervisor. 
{{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:05:46 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Creating image(s) Apr 20 16:05:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Acquiring lock "/opt/stack/data/nova/instances/a760987f-1a65-4e42-8cef-73db9ef2db48/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "/opt/stack/data/nova/instances/a760987f-1a65-4e42-8cef-73db9ef2db48/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "/opt/stack/data/nova/instances/a760987f-1a65-4e42-8cef-73db9ef2db48/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG nova.policy [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f50dbce30f294bb0ba6bc2811025835d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cb0a5eb3796a4d3a871843f409c6ffbd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 
'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.145s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:46 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.142s {{(pid=71605) 
execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/a760987f-1a65-4e42-8cef-73db9ef2db48/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/a760987f-1a65-4e42-8cef-73db9ef2db48/disk 1073741824" returned: 0 in 0.052s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.199s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-91f4b3d1-0fea-4378-94e3-c2bbfd8cad81" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-91f4b3d1-0fea-4378-94e3-c2bbfd8cad81" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 
-- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.141s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Checking if we can resize image /opt/stack/data/nova/instances/a760987f-1a65-4e42-8cef-73db9ef2db48/disk. size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a760987f-1a65-4e42-8cef-73db9ef2db48/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG nova.network.neutron [req-483eca30-c1f7-437c-a143-bfd493ec573e req-4750b284-37f9-4a6b-8285-914f2fce4685 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Updated VIF entry in instance network info cache for port 74703b46-6b03-4752-953b-9c64a63249c8. {{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG nova.network.neutron [req-483eca30-c1f7-437c-a143-bfd493ec573e req-4750b284-37f9-4a6b-8285-914f2fce4685 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Updating instance_info_cache with network_info: [{"id": "74703b46-6b03-4752-953b-9c64a63249c8", "address": "fa:16:3e:c5:94:d0", "network": {"id": "40132b20-6bfd-4f5a-8f6f-75769961d157", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-683065417-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.7", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a92cea9e1182477ca669c506b42eda60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap74703b46-6b", "ovs_interfaceid": "74703b46-6b03-4752-953b-9c64a63249c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-483eca30-c1f7-437c-a143-bfd493ec573e req-4750b284-37f9-4a6b-8285-914f2fce4685 service nova] Releasing lock "refresh_cache-dc918ed4-8bc6-4a4f-a189-d6cdd5817854" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 
tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a760987f-1a65-4e42-8cef-73db9ef2db48/disk --force-share --output=json" returned: 0 in 0.255s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Cannot resize image /opt/stack/data/nova/instances/a760987f-1a65-4e42-8cef-73db9ef2db48/disk to a smaller size. {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG nova.objects.instance [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lazy-loading 'migration_context' on Instance uuid a760987f-1a65-4e42-8cef-73db9ef2db48 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Ensure instance console log exists: /opt/stack/data/nova/instances/a760987f-1a65-4e42-8cef-73db9ef2db48/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:48 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG nova.network.neutron [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Successfully updated port: 9e814c79-86f6-46ce-9473-d87fb7e67641 {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "refresh_cache-c2b84ca2-f67b-4219-b7e6-18d2029e998a" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquired lock "refresh_cache-c2b84ca2-f67b-4219-b7e6-18d2029e998a" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG nova.network.neutron [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG nova.compute.manager [req-8caaa124-e0e7-49ac-98d3-42533cad44e2 req-cff0caed-62d3-401a-9b02-5c272a880972 service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Received event network-changed-9e814c79-86f6-46ce-9473-d87fb7e67641 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG nova.compute.manager [req-8caaa124-e0e7-49ac-98d3-42533cad44e2 req-cff0caed-62d3-401a-9b02-5c272a880972 service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Refreshing instance network info cache due to event network-changed-9e814c79-86f6-46ce-9473-d87fb7e67641. {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8caaa124-e0e7-49ac-98d3-42533cad44e2 req-cff0caed-62d3-401a-9b02-5c272a880972 service nova] Acquiring lock "refresh_cache-c2b84ca2-f67b-4219-b7e6-18d2029e998a" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG nova.network.neutron [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Instance cache missing network info. 
{{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Updating instance_info_cache with network_info: [{"id": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "address": "fa:16:3e:d0:3f:7b", "network": {"id": "224391e3-9d6f-4e5f-b1bb-00dd1cd0ea06", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1568684394-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.131", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbbcfeb5266f4ca6b9738b18ba7d127e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2af67f0-07", "ovs_interfaceid": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock "refresh_cache-91f4b3d1-0fea-4378-94e3-c2bbfd8cad81" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG nova.network.neutron [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Successfully created port: cac4dfaa-510a-4330-b9b1-aeb25f57abef {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG nova.network.neutron [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Updating instance_info_cache with network_info: [{"id": "9e814c79-86f6-46ce-9473-d87fb7e67641", "address": "fa:16:3e:24:4e:12", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e814c79-86", "ovs_interfaceid": "9e814c79-86f6-46ce-9473-d87fb7e67641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Releasing lock "refresh_cache-c2b84ca2-f67b-4219-b7e6-18d2029e998a" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG nova.compute.manager [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Instance network_info: |[{"id": "9e814c79-86f6-46ce-9473-d87fb7e67641", "address": "fa:16:3e:24:4e:12", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e814c79-86", "ovs_interfaceid": "9e814c79-86f6-46ce-9473-d87fb7e67641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8caaa124-e0e7-49ac-98d3-42533cad44e2 req-cff0caed-62d3-401a-9b02-5c272a880972 service nova] Acquired lock "refresh_cache-c2b84ca2-f67b-4219-b7e6-18d2029e998a" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG nova.network.neutron [req-8caaa124-e0e7-49ac-98d3-42533cad44e2 req-cff0caed-62d3-401a-9b02-5c272a880972 service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Refreshing network info cache for port 9e814c79-86f6-46ce-9473-d87fb7e67641 {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:05:49 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Start _get_guest_xml network_info=[{"id": "9e814c79-86f6-46ce-9473-d87fb7e67641", "address": "fa:16:3e:24:4e:12", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e814c79-86", "ovs_interfaceid": "9e814c79-86f6-46ce-9473-d87fb7e67641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 
'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': '4ac69ea5-e5d7-40c8-864e-0a164d78a727'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:05:50 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:05:50 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 
tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:05:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1498143817',display_name='tempest-AttachVolumeNegativeTest-server-1498143817',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1498143817',id=12,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM/e6+xMI0YH6Nw89h/OWSpnukYpq8WmC7TXM/G8CwHs84ixak8UdfgaeBRkeLKS6hBuTod5w5YIWjrhnSQwR7L2FaQ72Z5mCu+hRUU2g4pFa5raukmqUiXrVuyvOpMkNQ==',key_name='tempest-keypair-1502137074',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71cf2664111f45788d24092e8ceede9c',ramdisk_id='',reservation_id='r-ev0yb61c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-308436039',owner_user_name='tempest-AttachVolumeNegativeTest-308436039-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:05:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='690c49feae904687826fb959ba5ba283',uuid=c2b84ca2-f67b-4219-b7e6-18d2029e998a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9e814c79-86f6-46ce-9473-d87fb7e67641", "address": "fa:16:3e:24:4e:12", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e814c79-86", "ovs_interfaceid": "9e814c79-86f6-46ce-9473-d87fb7e67641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Converting VIF {"id": "9e814c79-86f6-46ce-9473-d87fb7e67641", "address": "fa:16:3e:24:4e:12", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e814c79-86", "ovs_interfaceid": "9e814c79-86f6-46ce-9473-d87fb7e67641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:4e:12,bridge_name='br-int',has_traffic_filtering=True,id=9e814c79-86f6-46ce-9473-d87fb7e67641,network=Network(2dc9b3da-0124-4718-9f70-a131cd030480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e814c79-86') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.objects.instance [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lazy-loading 'pci_devices' on Instance uuid c2b84ca2-f67b-4219-b7e6-18d2029e998a {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] End _get_guest_xml xml= Apr 20 16:05:50 user nova-compute[71605]: c2b84ca2-f67b-4219-b7e6-18d2029e998a Apr 20 16:05:50 user nova-compute[71605]: instance-0000000c Apr 20 16:05:50 user nova-compute[71605]: 131072 Apr 20 16:05:50 user nova-compute[71605]: 1 Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: tempest-AttachVolumeNegativeTest-server-1498143817 Apr 20 16:05:50 user nova-compute[71605]: 2023-04-20 16:05:50 Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: 128 Apr 20 16:05:50 user nova-compute[71605]: 1 Apr 20 16:05:50 user nova-compute[71605]: 0 Apr 20 16:05:50 user nova-compute[71605]: 0 Apr 20 16:05:50 user nova-compute[71605]: 1 Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: tempest-AttachVolumeNegativeTest-308436039-project-member Apr 20 16:05:50 user nova-compute[71605]: tempest-AttachVolumeNegativeTest-308436039 Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: OpenStack Foundation Apr 20 16:05:50 user nova-compute[71605]: OpenStack Nova Apr 20 16:05:50 user nova-compute[71605]: 0.0.0 Apr 20 16:05:50 user 
nova-compute[71605]: c2b84ca2-f67b-4219-b7e6-18d2029e998a Apr 20 16:05:50 user nova-compute[71605]: c2b84ca2-f67b-4219-b7e6-18d2029e998a Apr 20 16:05:50 user nova-compute[71605]: Virtual Machine Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: hvm Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Nehalem Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: /dev/urandom Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: Apr 20 16:05:50 user nova-compute[71605]: {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:05:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1498143817',display_name='tempest-AttachVolumeNegativeTest-server-1498143817',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1498143817',id=12,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM/e6+xMI0YH6Nw89h/OWSpnukYpq8WmC7TXM/G8CwHs84ixak8UdfgaeBRkeLKS6hBuTod5w5YIWjrhnSQwR7L2FaQ72Z5mCu+hRUU2g4pFa5raukmqUiXrVuyvOpMkNQ==',key_name='tempest-keypair-1502137074',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71cf2664111f45788d24092e8ceede9c',ramdisk_id='',reservation_id='r-ev0yb61c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-308436039',owner_user_name='tempest-AttachVolumeNegativeTest-308436039-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:05:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='690c49feae904687826fb959ba5ba283',uuid=c2b84ca2-f67b-4219-b7e6-18d2029e998a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9e814c79-86f6-46ce-9473-d87fb7e67641", "address": "fa:16:3e:24:4e:12", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e814c79-86", "ovs_interfaceid": "9e814c79-86f6-46ce-9473-d87fb7e67641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Converting VIF {"id": "9e814c79-86f6-46ce-9473-d87fb7e67641", "address": "fa:16:3e:24:4e:12", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e814c79-86", "ovs_interfaceid": "9e814c79-86f6-46ce-9473-d87fb7e67641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:4e:12,bridge_name='br-int',has_traffic_filtering=True,id=9e814c79-86f6-46ce-9473-d87fb7e67641,network=Network(2dc9b3da-0124-4718-9f70-a131cd030480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e814c79-86') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG os_vif [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4e:12,bridge_name='br-int',has_traffic_filtering=True,id=9e814c79-86f6-46ce-9473-d87fb7e67641,network=Network(2dc9b3da-0124-4718-9f70-a131cd030480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e814c79-86') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9e814c79-86, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9e814c79-86, col_values=(('external_ids', {'iface-id': '9e814c79-86f6-46ce-9473-d87fb7e67641', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:4e:12', 'vm-uuid': 'c2b84ca2-f67b-4219-b7e6-18d2029e998a'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:50 user nova-compute[71605]: INFO os_vif [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:4e:12,bridge_name='br-int',has_traffic_filtering=True,id=9e814c79-86f6-46ce-9473-d87fb7e67641,network=Network(2dc9b3da-0124-4718-9f70-a131cd030480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e814c79-86') Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] No BDM found with device name vda, not building metadata. {{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] No VIF found with MAC fa:16:3e:24:4e:12, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Acquiring lock "a145fb51-4ca5-4cc4-b8bd-cd3665bef473" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "a145fb51-4ca5-4cc4-b8bd-cd3665bef473" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.compute.manager [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Starting instance... 
{{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:05:50 user nova-compute[71605]: INFO nova.compute.claims [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Claim successful on node user Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.network.neutron [req-8caaa124-e0e7-49ac-98d3-42533cad44e2 req-cff0caed-62d3-401a-9b02-5c272a880972 service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Updated VIF entry in instance network info cache for port 9e814c79-86f6-46ce-9473-d87fb7e67641. 
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.network.neutron [req-8caaa124-e0e7-49ac-98d3-42533cad44e2 req-cff0caed-62d3-401a-9b02-5c272a880972 service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Updating instance_info_cache with network_info: [{"id": "9e814c79-86f6-46ce-9473-d87fb7e67641", "address": "fa:16:3e:24:4e:12", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e814c79-86", "ovs_interfaceid": "9e814c79-86f6-46ce-9473-d87fb7e67641", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8caaa124-e0e7-49ac-98d3-42533cad44e2 req-cff0caed-62d3-401a-9b02-5c272a880972 service nova] Releasing lock "refresh_cache-c2b84ca2-f67b-4219-b7e6-18d2029e998a" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.545s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.compute.manager [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 
tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Start building networks asynchronously for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.compute.manager [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Allocating IP information in the background. {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.network.neutron [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:05:50 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 16:05:50 user nova-compute[71605]: DEBUG nova.compute.manager [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Acquiring lock "15d42ba7-cf47-4374-83b5-06d5242951b7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Lock "15d42ba7-cf47-4374-83b5-06d5242951b7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG nova.network.neutron [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Successfully updated port: cac4dfaa-510a-4330-b9b1-aeb25f57abef {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG nova.policy [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c92692a1d38b4531a4e7f42660a54c7b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 
'a92cea9e1182477ca669c506b42eda60', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG nova.compute.manager [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Start spawning the instance on the hypervisor. {{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:05:51 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Creating image(s) Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Acquiring lock "/opt/stack/data/nova/instances/a145fb51-4ca5-4cc4-b8bd-cd3665bef473/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "/opt/stack/data/nova/instances/a145fb51-4ca5-4cc4-b8bd-cd3665bef473/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "/opt/stack/data/nova/instances/a145fb51-4ca5-4cc4-b8bd-cd3665bef473/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG nova.compute.manager [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Starting instance... 
{{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Acquiring lock "refresh_cache-a760987f-1a65-4e42-8cef-73db9ef2db48" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Acquired lock "refresh_cache-a760987f-1a65-4e42-8cef-73db9ef2db48" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG nova.network.neutron [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG nova.compute.manager [req-37ffeb6b-a261-454a-84c4-e50909a5c4a8 req-aec5403c-a664-4b53-a164-def50481aa27 service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Received event network-changed-cac4dfaa-510a-4330-b9b1-aeb25f57abef {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG nova.compute.manager [req-37ffeb6b-a261-454a-84c4-e50909a5c4a8 req-aec5403c-a664-4b53-a164-def50481aa27 service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Refreshing instance network info cache due to event network-changed-cac4dfaa-510a-4330-b9b1-aeb25f57abef. 
{{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-37ffeb6b-a261-454a-84c4-e50909a5c4a8 req-aec5403c-a664-4b53-a164-def50481aa27 service nova] Acquiring lock "refresh_cache-a760987f-1a65-4e42-8cef-73db9ef2db48" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:05:51 user nova-compute[71605]: INFO nova.compute.claims [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Claim successful on node user Apr 20 16:05:51 user nova-compute[71605]: DEBUG nova.network.neutron [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Instance cache missing network info. 
{{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.154s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.220s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/a145fb51-4ca5-4cc4-b8bd-cd3665bef473/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/a145fb51-4ca5-4cc4-b8bd-cd3665bef473/disk 1073741824" returned: 0 in 0.076s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.301s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.187s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Checking if we can resize image /opt/stack/data/nova/instances/a145fb51-4ca5-4cc4-b8bd-cd3665bef473/disk. 
size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a145fb51-4ca5-4cc4-b8bd-cd3665bef473/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a145fb51-4ca5-4cc4-b8bd-cd3665bef473/disk --force-share --output=json" returned: 0 in 0.163s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Cannot resize image /opt/stack/data/nova/instances/a145fb51-4ca5-4cc4-b8bd-cd3665bef473/disk to a smaller size. {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:05:51 user nova-compute[71605]: DEBUG nova.objects.instance [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lazy-loading 'migration_context' on Instance uuid a145fb51-4ca5-4cc4-b8bd-cd3665bef473 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Ensure instance console log exists: /opt/stack/data/nova/instances/a145fb51-4ca5-4cc4-b8bd-cd3665bef473/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.898s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.compute.manager [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Start building networks asynchronously for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.compute.manager [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Allocating IP information in the background. 
{{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.network.neutron [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:52 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.compute.manager [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.network.neutron [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Updating instance_info_cache with network_info: [{"id": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "address": "fa:16:3e:97:a1:b9", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac4dfaa-51", "ovs_interfaceid": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Releasing lock "refresh_cache-a760987f-1a65-4e42-8cef-73db9ef2db48" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.compute.manager [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Instance network_info: |[{"id": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "address": 
"fa:16:3e:97:a1:b9", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac4dfaa-51", "ovs_interfaceid": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-37ffeb6b-a261-454a-84c4-e50909a5c4a8 req-aec5403c-a664-4b53-a164-def50481aa27 service nova] Acquired lock "refresh_cache-a760987f-1a65-4e42-8cef-73db9ef2db48" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.network.neutron [req-37ffeb6b-a261-454a-84c4-e50909a5c4a8 req-aec5403c-a664-4b53-a164-def50481aa27 service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Refreshing network info cache for port cac4dfaa-510a-4330-b9b1-aeb25f57abef {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Start _get_guest_xml network_info=[{"id": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "address": "fa:16:3e:97:a1:b9", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac4dfaa-51", "ovs_interfaceid": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': '4ac69ea5-e5d7-40c8-864e-0a164d78a727'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:05:52 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:05:52 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-2c704a99-6f91-4bca-89fe-d39409556b75 
tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:05:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-924389841',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-924389841',id=13,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDVMEOOD2DPBjhKcYiA5lmZjVYxh9PWLGO75MzhXO3aLsn0kBvkh5hqWzAscvsUYLQELbD8L/orvsrJrdTwkd7/EBmpsdlVzjqkj4vcLr/kYQYhKCohu26BkQL4kIIGz1A==',key_name='tempest-keypair-297016939',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cb0a5eb3796a4d3a871843f409c6ffbd',ramdisk_id='',reservation_id='r-xxotsh8t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1118127371',owner_user_name='tempest-AttachVolumeShelveTestJSON-1118127371-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:05:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='f50dbce30f294bb0ba6bc2811025835d',uuid=a760987f-1a65-4e42-8cef-73db9ef2db48,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "address": "fa:16:3e:97:a1:b9", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac4dfaa-51", "ovs_interfaceid": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:05:52 user 
nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Converting VIF {"id": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "address": "fa:16:3e:97:a1:b9", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac4dfaa-51", "ovs_interfaceid": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:a1:b9,bridge_name='br-int',has_traffic_filtering=True,id=cac4dfaa-510a-4330-b9b1-aeb25f57abef,network=Network(545a57d8-9d55-4ace-a0ad-635d7bc0ae52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcac4dfaa-51') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.objects.instance [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lazy-loading 'pci_devices' on Instance uuid a760987f-1a65-4e42-8cef-73db9ef2db48 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.compute.manager [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Start spawning the instance on the hypervisor. 
{{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:05:52 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Creating image(s) Apr 20 16:05:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Acquiring lock "/opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Lock "/opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Lock "/opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] End _get_guest_xml xml=[libvirt domain XML elided: the element markup of the multi-line XML logged here was stripped in this capture; the surviving fields show uuid a760987f-1a65-4e42-8cef-73db9ef2db48, name instance-0000000d, 131072 KiB memory, 1 vCPU, Nova metadata for tempest-AttachVolumeShelveTestJSON-server-924389841 created 2023-04-20 16:05:52 (flavor: 128 MB, 1 vCPU, swap 0, ephemeral 0, 1 GB root) owned by tempest-AttachVolumeShelveTestJSON-1118127371-project-member / tempest-AttachVolumeShelveTestJSON-1118127371, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, product Virtual Machine, os type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=71605
_get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:05:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-924389841',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-924389841',id=13,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDVMEOOD2DPBjhKcYiA5lmZjVYxh9PWLGO75MzhXO3aLsn0kBvkh5hqWzAscvsUYLQELbD8L/orvsrJrdTwkd7/EBmpsdlVzjqkj4vcLr/kYQYhKCohu26BkQL4kIIGz1A==',key_name='tempest-keypair-297016939',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='cb0a5eb3796a4d3a871843f409c6ffbd',ramdisk_id='',reservation_id='r-xxotsh8t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1118127371',owner_user_name='tempest-AttachVolumeShelveTestJSON-1118127371-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:05:47Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='f50dbce30f294bb0ba6bc2811025835d',uuid=a760987f-1a65-4e42-8cef-73db9ef2db48,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "address": "fa:16:3e:97:a1:b9", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac4dfaa-51", 
"ovs_interfaceid": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Converting VIF {"id": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "address": "fa:16:3e:97:a1:b9", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac4dfaa-51", "ovs_interfaceid": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:a1:b9,bridge_name='br-int',has_traffic_filtering=True,id=cac4dfaa-510a-4330-b9b1-aeb25f57abef,network=Network(545a57d8-9d55-4ace-a0ad-635d7bc0ae52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcac4dfaa-51') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG os_vif [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:a1:b9,bridge_name='br-int',has_traffic_filtering=True,id=cac4dfaa-510a-4330-b9b1-aeb25f57abef,network=Network(545a57d8-9d55-4ace-a0ad-635d7bc0ae52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcac4dfaa-51') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.policy [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dd6dee2194d04f45a81fd0ef45ca0632', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fbd2a72dddad4f2892243a33df4fa2d1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) 
authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcac4dfaa-51, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcac4dfaa-51, col_values=(('external_ids', {'iface-id': 'cac4dfaa-510a-4330-b9b1-aeb25f57abef', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:97:a1:b9', 'vm-uuid': 'a760987f-1a65-4e42-8cef-73db9ef2db48'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:52 user nova-compute[71605]: INFO os_vif [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:a1:b9,bridge_name='br-int',has_traffic_filtering=True,id=cac4dfaa-510a-4330-b9b1-aeb25f57abef,network=Network(545a57d8-9d55-4ace-a0ad-635d7bc0ae52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcac4dfaa-51') Apr 20 16:05:52 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.131s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 
tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] No BDM found with device name vda, not building metadata. {{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] No VIF found with MAC fa:16:3e:97:a1:b9, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.129s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o 
backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk 1073741824" returned: 0 in 0.071s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.206s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.133s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Checking if we can resize image /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk. 
size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:05:52 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json" returned: 0 in 0.178s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Cannot resize image /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk to a smaller size. {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.objects.instance [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Lazy-loading 'migration_context' on Instance uuid 15d42ba7-cf47-4374-83b5-06d5242951b7 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Ensure instance console log exists: /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 
0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.network.neutron [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Successfully created port: 989ee5cd-ff10-4bcc-9b11-017b23299187 {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.compute.manager [req-281cf31d-7082-480f-8c1c-ba31cb113077 req-febb4c65-8c16-4355-a0c7-c793f91f13b6 service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Received event network-vif-plugged-9e814c79-86f6-46ce-9473-d87fb7e67641 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-281cf31d-7082-480f-8c1c-ba31cb113077 req-febb4c65-8c16-4355-a0c7-c793f91f13b6 service nova] Acquiring lock "c2b84ca2-f67b-4219-b7e6-18d2029e998a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-281cf31d-7082-480f-8c1c-ba31cb113077 req-febb4c65-8c16-4355-a0c7-c793f91f13b6 service nova] Lock "c2b84ca2-f67b-4219-b7e6-18d2029e998a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-281cf31d-7082-480f-8c1c-ba31cb113077 req-febb4c65-8c16-4355-a0c7-c793f91f13b6 service nova] Lock "c2b84ca2-f67b-4219-b7e6-18d2029e998a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.compute.manager [req-281cf31d-7082-480f-8c1c-ba31cb113077 req-febb4c65-8c16-4355-a0c7-c793f91f13b6 service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] No waiting events found dispatching network-vif-plugged-9e814c79-86f6-46ce-9473-d87fb7e67641 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:05:53 user nova-compute[71605]: WARNING nova.compute.manager [req-281cf31d-7082-480f-8c1c-ba31cb113077 req-febb4c65-8c16-4355-a0c7-c793f91f13b6 service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Received unexpected event network-vif-plugged-9e814c79-86f6-46ce-9473-d87fb7e67641 for instance with vm_state building and task_state spawning. 
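(Editor's note: the "Acquiring lock … / acquired … / released …" triplets above, such as the "vgpu_resources" lock around _allocate_mdevs, are emitted by oslo.concurrency's lock helpers at DEBUG level. A minimal, self-contained sketch of that pattern follows; the lock names and function are illustrative placeholders, not Nova's actual code.)

    from oslo_concurrency import lockutils

    # Decorating a function serializes callers on the named lock and produces
    # the "Acquiring lock ... by ..." / "acquired ... :: waited" / "released
    # ... :: held" DEBUG lines seen in the log above.
    @lockutils.synchronized('vgpu_resources')
    def allocate_mdevs_example():
        # placeholder for work done while the lock is held
        return []

    # The same helper is also available as a context manager.
    with lockutils.lock('compute_resources'):
        pass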
Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.compute.manager [req-281cf31d-7082-480f-8c1c-ba31cb113077 req-febb4c65-8c16-4355-a0c7-c793f91f13b6 service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Received event network-vif-plugged-9e814c79-86f6-46ce-9473-d87fb7e67641 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-281cf31d-7082-480f-8c1c-ba31cb113077 req-febb4c65-8c16-4355-a0c7-c793f91f13b6 service nova] Acquiring lock "c2b84ca2-f67b-4219-b7e6-18d2029e998a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-281cf31d-7082-480f-8c1c-ba31cb113077 req-febb4c65-8c16-4355-a0c7-c793f91f13b6 service nova] Lock "c2b84ca2-f67b-4219-b7e6-18d2029e998a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-281cf31d-7082-480f-8c1c-ba31cb113077 req-febb4c65-8c16-4355-a0c7-c793f91f13b6 service nova] Lock "c2b84ca2-f67b-4219-b7e6-18d2029e998a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.compute.manager [req-281cf31d-7082-480f-8c1c-ba31cb113077 req-febb4c65-8c16-4355-a0c7-c793f91f13b6 service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] No waiting events found dispatching network-vif-plugged-9e814c79-86f6-46ce-9473-d87fb7e67641 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:05:53 user nova-compute[71605]: WARNING nova.compute.manager [req-281cf31d-7082-480f-8c1c-ba31cb113077 req-febb4c65-8c16-4355-a0c7-c793f91f13b6 service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Received unexpected event network-vif-plugged-9e814c79-86f6-46ce-9473-d87fb7e67641 for instance with vm_state building and task_state spawning. Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.network.neutron [req-37ffeb6b-a261-454a-84c4-e50909a5c4a8 req-aec5403c-a664-4b53-a164-def50481aa27 service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Updated VIF entry in instance network info cache for port cac4dfaa-510a-4330-b9b1-aeb25f57abef. 
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.network.neutron [req-37ffeb6b-a261-454a-84c4-e50909a5c4a8 req-aec5403c-a664-4b53-a164-def50481aa27 service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Updating instance_info_cache with network_info: [{"id": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "address": "fa:16:3e:97:a1:b9", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac4dfaa-51", "ovs_interfaceid": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-37ffeb6b-a261-454a-84c4-e50909a5c4a8 req-aec5403c-a664-4b53-a164-def50481aa27 service nova] Releasing lock "refresh_cache-a760987f-1a65-4e42-8cef-73db9ef2db48" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:05:53 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] VM Resumed (Lifecycle Event) Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.compute.manager [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:05:53 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Instance spawned successfully. 
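(Editor's note: the network_info blobs written to instance_info_cache above are plain JSON-style structures: a list of VIF dicts, each carrying a "network" with "subnets" and "ips". As a rough illustration only, not Nova code, the helper below pulls the MAC and fixed IPs out of an entry shaped like the one cached for port cac4dfaa-510a-4330-b9b1-aeb25f57abef.)

    # Illustrative only: walk a cached network_info list (as logged above) and
    # collect (mac, [fixed ips]) per VIF.
    def fixed_ips_from_network_info(network_info):
        result = []
        for vif in network_info:
            ips = []
            for subnet in vif.get("network", {}).get("subnets", []):
                for ip in subnet.get("ips", []):
                    if ip.get("type") == "fixed":
                        ips.append(ip["address"])
            result.append((vif.get("address"), ips))
        return result

    # With the structure shown in the log this returns
    # [("fa:16:3e:97:a1:b9", ["10.0.0.4"])].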
Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Found default for hw_disk_bus of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Found default for hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:05:53 user nova-compute[71605]: INFO nova.compute.manager [None 
req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:05:53 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] VM Started (Lifecycle Event) Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:53 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:05:53 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:53 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:53 user nova-compute[71605]: INFO nova.compute.manager [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Took 8.87 seconds to spawn the instance on the hypervisor. Apr 20 16:05:53 user nova-compute[71605]: DEBUG nova.compute.manager [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:05:54 user nova-compute[71605]: INFO nova.compute.manager [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Took 12.04 seconds to build instance. 
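(Editor's note: the "Took N seconds to build instance" INFO lines, like the 12.04 s entry just above, are easy to extract from a journal dump such as this one. A small sketch, assuming the log is available as plain text:)

    import re

    # Matches e.g. "[instance: c2b84ca2-...] Took 12.04 seconds to build instance."
    BUILD_RE = re.compile(
        r"\[instance: (?P<uuid>[0-9a-f-]+)\] "
        r"Took (?P<secs>[\d.]+) seconds to build instance")

    def build_times(log_text):
        """Return {instance_uuid: seconds} for completed builds in a log dump."""
        return {m.group("uuid"): float(m.group("secs"))
                for m in BUILD_RE.finditer(log_text)}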
Apr 20 16:05:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-38980c95-62f3-4bd1-a667-16fa8265d2e7 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "c2b84ca2-f67b-4219-b7e6-18d2029e998a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 12.192s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:54 user nova-compute[71605]: DEBUG nova.network.neutron [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Successfully created port: e068d7e5-dc70-4b18-8dd6-5726f7a3bc84 {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:05:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.network.neutron [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Successfully updated port: 989ee5cd-ff10-4bcc-9b11-017b23299187 {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Acquiring lock "refresh_cache-a145fb51-4ca5-4cc4-b8bd-cd3665bef473" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Acquired lock "refresh_cache-a145fb51-4ca5-4cc4-b8bd-cd3665bef473" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.network.neutron [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Building network info cache for instance {{(pid=71605) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.compute.manager [req-5dfcb976-321a-477e-a7a3-ae0e38e7de52 req-2851231d-0fe4-44ba-91c2-37b795e62904 service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Received event network-vif-plugged-cac4dfaa-510a-4330-b9b1-aeb25f57abef {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-5dfcb976-321a-477e-a7a3-ae0e38e7de52 req-2851231d-0fe4-44ba-91c2-37b795e62904 service nova] Acquiring lock "a760987f-1a65-4e42-8cef-73db9ef2db48-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-5dfcb976-321a-477e-a7a3-ae0e38e7de52 req-2851231d-0fe4-44ba-91c2-37b795e62904 service nova] Lock "a760987f-1a65-4e42-8cef-73db9ef2db48-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-5dfcb976-321a-477e-a7a3-ae0e38e7de52 req-2851231d-0fe4-44ba-91c2-37b795e62904 service nova] Lock "a760987f-1a65-4e42-8cef-73db9ef2db48-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.compute.manager [req-5dfcb976-321a-477e-a7a3-ae0e38e7de52 req-2851231d-0fe4-44ba-91c2-37b795e62904 service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] No waiting events found dispatching network-vif-plugged-cac4dfaa-510a-4330-b9b1-aeb25f57abef {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:05:55 user nova-compute[71605]: WARNING nova.compute.manager [req-5dfcb976-321a-477e-a7a3-ae0e38e7de52 req-2851231d-0fe4-44ba-91c2-37b795e62904 service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Received unexpected event network-vif-plugged-cac4dfaa-510a-4330-b9b1-aeb25f57abef for instance with vm_state building and task_state spawning. 
Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.compute.manager [req-5dfcb976-321a-477e-a7a3-ae0e38e7de52 req-2851231d-0fe4-44ba-91c2-37b795e62904 service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Received event network-vif-plugged-cac4dfaa-510a-4330-b9b1-aeb25f57abef {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-5dfcb976-321a-477e-a7a3-ae0e38e7de52 req-2851231d-0fe4-44ba-91c2-37b795e62904 service nova] Acquiring lock "a760987f-1a65-4e42-8cef-73db9ef2db48-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-5dfcb976-321a-477e-a7a3-ae0e38e7de52 req-2851231d-0fe4-44ba-91c2-37b795e62904 service nova] Lock "a760987f-1a65-4e42-8cef-73db9ef2db48-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-5dfcb976-321a-477e-a7a3-ae0e38e7de52 req-2851231d-0fe4-44ba-91c2-37b795e62904 service nova] Lock "a760987f-1a65-4e42-8cef-73db9ef2db48-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.003s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.compute.manager [req-5dfcb976-321a-477e-a7a3-ae0e38e7de52 req-2851231d-0fe4-44ba-91c2-37b795e62904 service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] No waiting events found dispatching network-vif-plugged-cac4dfaa-510a-4330-b9b1-aeb25f57abef {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:05:55 user nova-compute[71605]: WARNING nova.compute.manager [req-5dfcb976-321a-477e-a7a3-ae0e38e7de52 req-2851231d-0fe4-44ba-91c2-37b795e62904 service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Received unexpected event network-vif-plugged-cac4dfaa-510a-4330-b9b1-aeb25f57abef for instance with vm_state building and task_state spawning. Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.compute.manager [req-5dfcb976-321a-477e-a7a3-ae0e38e7de52 req-2851231d-0fe4-44ba-91c2-37b795e62904 service nova] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Received event network-changed-989ee5cd-ff10-4bcc-9b11-017b23299187 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.compute.manager [req-5dfcb976-321a-477e-a7a3-ae0e38e7de52 req-2851231d-0fe4-44ba-91c2-37b795e62904 service nova] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Refreshing instance network info cache due to event network-changed-989ee5cd-ff10-4bcc-9b11-017b23299187. 
{{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-5dfcb976-321a-477e-a7a3-ae0e38e7de52 req-2851231d-0fe4-44ba-91c2-37b795e62904 service nova] Acquiring lock "refresh_cache-a145fb51-4ca5-4cc4-b8bd-cd3665bef473" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.network.neutron [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Instance cache missing network info. {{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:05:55 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] VM Stopped (Lifecycle Event) Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f9a5cfdd-0ee8-4d31-b1d3-33df3d3b2596 None None] [instance: e1036e0f-683f-4dfd-b0ad-6187d90ff2f6] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.network.neutron [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Updating instance_info_cache with network_info: [{"id": "989ee5cd-ff10-4bcc-9b11-017b23299187", "address": "fa:16:3e:92:5b:74", "network": {"id": "40132b20-6bfd-4f5a-8f6f-75769961d157", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-683065417-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a92cea9e1182477ca669c506b42eda60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap989ee5cd-ff", "ovs_interfaceid": "989ee5cd-ff10-4bcc-9b11-017b23299187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Releasing lock "refresh_cache-a145fb51-4ca5-4cc4-b8bd-cd3665bef473" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.compute.manager [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Instance network_info: |[{"id": "989ee5cd-ff10-4bcc-9b11-017b23299187", "address": 
"fa:16:3e:92:5b:74", "network": {"id": "40132b20-6bfd-4f5a-8f6f-75769961d157", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-683065417-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a92cea9e1182477ca669c506b42eda60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap989ee5cd-ff", "ovs_interfaceid": "989ee5cd-ff10-4bcc-9b11-017b23299187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-5dfcb976-321a-477e-a7a3-ae0e38e7de52 req-2851231d-0fe4-44ba-91c2-37b795e62904 service nova] Acquired lock "refresh_cache-a145fb51-4ca5-4cc4-b8bd-cd3665bef473" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.network.neutron [req-5dfcb976-321a-477e-a7a3-ae0e38e7de52 req-2851231d-0fe4-44ba-91c2-37b795e62904 service nova] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Refreshing network info cache for port 989ee5cd-ff10-4bcc-9b11-017b23299187 {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Start _get_guest_xml network_info=[{"id": "989ee5cd-ff10-4bcc-9b11-017b23299187", "address": "fa:16:3e:92:5b:74", "network": {"id": "40132b20-6bfd-4f5a-8f6f-75769961d157", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-683065417-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a92cea9e1182477ca669c506b42eda60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap989ee5cd-ff", "ovs_interfaceid": "989ee5cd-ff10-4bcc-9b11-017b23299187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': '4ac69ea5-e5d7-40c8-864e-0a164d78a727'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:05:55 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:05:55 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 
tempest-VolumesAdminNegativeTest-978356230-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:05:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1165125210',display_name='tempest-VolumesAdminNegativeTest-server-1165125210',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1165125210',id=14,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a92cea9e1182477ca669c506b42eda60',ramdisk_id='',reservation_id='r-f72ecp4q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-978356230',owner_user_name='tempest-VolumesAdminNegativeTest-978356230-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:05:51Z,user_data=None,user_id='c92692a1d38b4531a4e7f42660a54c7b',uuid=a145fb51-4ca5-4cc4-b8bd-cd3665bef473,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "989ee5cd-ff10-4bcc-9b11-017b23299187", "address": "fa:16:3e:92:5b:74", "network": {"id": "40132b20-6bfd-4f5a-8f6f-75769961d157", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-683065417-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a92cea9e1182477ca669c506b42eda60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap989ee5cd-ff", "ovs_interfaceid": "989ee5cd-ff10-4bcc-9b11-017b23299187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Converting VIF {"id": "989ee5cd-ff10-4bcc-9b11-017b23299187", "address": "fa:16:3e:92:5b:74", "network": {"id": 
"40132b20-6bfd-4f5a-8f6f-75769961d157", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-683065417-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a92cea9e1182477ca669c506b42eda60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap989ee5cd-ff", "ovs_interfaceid": "989ee5cd-ff10-4bcc-9b11-017b23299187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:5b:74,bridge_name='br-int',has_traffic_filtering=True,id=989ee5cd-ff10-4bcc-9b11-017b23299187,network=Network(40132b20-6bfd-4f5a-8f6f-75769961d157),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap989ee5cd-ff') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.objects.instance [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lazy-loading 'pci_devices' on Instance uuid a145fb51-4ca5-4cc4-b8bd-cd3665bef473 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] End _get_guest_xml xml= Apr 20 16:05:55 user nova-compute[71605]: a145fb51-4ca5-4cc4-b8bd-cd3665bef473 Apr 20 16:05:55 user nova-compute[71605]: instance-0000000e Apr 20 16:05:55 user nova-compute[71605]: 131072 Apr 20 16:05:55 user nova-compute[71605]: 1 Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: tempest-VolumesAdminNegativeTest-server-1165125210 Apr 20 16:05:55 user nova-compute[71605]: 2023-04-20 16:05:55 Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: 128 Apr 20 16:05:55 user nova-compute[71605]: 1 Apr 20 16:05:55 user nova-compute[71605]: 0 Apr 20 16:05:55 user nova-compute[71605]: 0 Apr 20 16:05:55 user nova-compute[71605]: 1 Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: tempest-VolumesAdminNegativeTest-978356230-project-member Apr 20 16:05:55 user nova-compute[71605]: tempest-VolumesAdminNegativeTest-978356230 Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user 
nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: OpenStack Foundation Apr 20 16:05:55 user nova-compute[71605]: OpenStack Nova Apr 20 16:05:55 user nova-compute[71605]: 0.0.0 Apr 20 16:05:55 user nova-compute[71605]: a145fb51-4ca5-4cc4-b8bd-cd3665bef473 Apr 20 16:05:55 user nova-compute[71605]: a145fb51-4ca5-4cc4-b8bd-cd3665bef473 Apr 20 16:05:55 user nova-compute[71605]: Virtual Machine Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: hvm Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Nehalem Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: /dev/urandom Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: Apr 20 16:05:55 user nova-compute[71605]: {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:05:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1165125210',display_name='tempest-VolumesAdminNegativeTest-server-1165125210',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1165125210',id=14,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a92cea9e1182477ca669c506b42eda60',ramdisk_id='',reservation_id='r-f72ecp4q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-978356230',owner_user_name='tempest-VolumesAdminNegativeTest-978356230-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:05:51Z,user_data=None,user_id='c92692a1d38b4531a4e7f42660a54c7b',uuid=a145fb51-4ca5-4cc4-b8bd-cd3665bef473,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "989ee5cd-ff10-4bcc-9b11-017b23299187", "address": "fa:16:3e:92:5b:74", "network": {"id": "40132b20-6bfd-4f5a-8f6f-75769961d157", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-683065417-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a92cea9e1182477ca669c506b42eda60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap989ee5cd-ff", "ovs_interfaceid": "989ee5cd-ff10-4bcc-9b11-017b23299187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Converting VIF {"id": "989ee5cd-ff10-4bcc-9b11-017b23299187", "address": "fa:16:3e:92:5b:74", "network": {"id": 
"40132b20-6bfd-4f5a-8f6f-75769961d157", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-683065417-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a92cea9e1182477ca669c506b42eda60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap989ee5cd-ff", "ovs_interfaceid": "989ee5cd-ff10-4bcc-9b11-017b23299187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:5b:74,bridge_name='br-int',has_traffic_filtering=True,id=989ee5cd-ff10-4bcc-9b11-017b23299187,network=Network(40132b20-6bfd-4f5a-8f6f-75769961d157),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap989ee5cd-ff') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG os_vif [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:5b:74,bridge_name='br-int',has_traffic_filtering=True,id=989ee5cd-ff10-4bcc-9b11-017b23299187,network=Network(40132b20-6bfd-4f5a-8f6f-75769961d157),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap989ee5cd-ff') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap989ee5cd-ff, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): 
DbSetCommand(_result=None, table=Interface, record=tap989ee5cd-ff, col_values=(('external_ids', {'iface-id': '989ee5cd-ff10-4bcc-9b11-017b23299187', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:92:5b:74', 'vm-uuid': 'a145fb51-4ca5-4cc4-b8bd-cd3665bef473'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:05:55 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:55 user nova-compute[71605]: INFO os_vif [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:5b:74,bridge_name='br-int',has_traffic_filtering=True,id=989ee5cd-ff10-4bcc-9b11-017b23299187,network=Network(40132b20-6bfd-4f5a-8f6f-75769961d157),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap989ee5cd-ff') Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] No BDM found with device name vda, not building metadata. {{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] No VIF found with MAC fa:16:3e:92:5b:74, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.network.neutron [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Successfully updated port: e068d7e5-dc70-4b18-8dd6-5726f7a3bc84 {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Acquiring lock "refresh_cache-15d42ba7-cf47-4374-83b5-06d5242951b7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Acquired lock "refresh_cache-15d42ba7-cf47-4374-83b5-06d5242951b7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.network.neutron [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 
tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.network.neutron [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Instance cache missing network info. {{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.compute.manager [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:05:56 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] VM Resumed (Lifecycle Event) Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:05:56 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Instance spawned successfully. 
Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Found default for hw_disk_bus of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Found default for hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:05:56 user nova-compute[71605]: INFO 
nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:05:56 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] VM Started (Lifecycle Event) Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:05:56 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:05:56 user nova-compute[71605]: INFO nova.compute.manager [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Took 9.65 seconds to spawn the instance on the hypervisor. Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.compute.manager [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:05:56 user nova-compute[71605]: INFO nova.compute.manager [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Took 11.42 seconds to build instance. Apr 20 16:05:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-2c704a99-6f91-4bca-89fe-d39409556b75 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "a760987f-1a65-4e42-8cef-73db9ef2db48" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.608s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.network.neutron [req-5dfcb976-321a-477e-a7a3-ae0e38e7de52 req-2851231d-0fe4-44ba-91c2-37b795e62904 service nova] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Updated VIF entry in instance network info cache for port 989ee5cd-ff10-4bcc-9b11-017b23299187. 
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.network.neutron [req-5dfcb976-321a-477e-a7a3-ae0e38e7de52 req-2851231d-0fe4-44ba-91c2-37b795e62904 service nova] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Updating instance_info_cache with network_info: [{"id": "989ee5cd-ff10-4bcc-9b11-017b23299187", "address": "fa:16:3e:92:5b:74", "network": {"id": "40132b20-6bfd-4f5a-8f6f-75769961d157", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-683065417-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a92cea9e1182477ca669c506b42eda60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap989ee5cd-ff", "ovs_interfaceid": "989ee5cd-ff10-4bcc-9b11-017b23299187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-5dfcb976-321a-477e-a7a3-ae0e38e7de52 req-2851231d-0fe4-44ba-91c2-37b795e62904 service nova] Releasing lock "refresh_cache-a145fb51-4ca5-4cc4-b8bd-cd3665bef473" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.network.neutron [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Updating instance_info_cache with network_info: [{"id": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "address": "fa:16:3e:15:a2:f4", "network": {"id": "9de26342-0f6c-4d7d-96a5-d4ad35573211", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1378273293-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbd2a72dddad4f2892243a33df4fa2d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape068d7e5-dc", "ovs_interfaceid": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Releasing lock "refresh_cache-15d42ba7-cf47-4374-83b5-06d5242951b7" {{(pid=71605) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.compute.manager [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Instance network_info: |[{"id": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "address": "fa:16:3e:15:a2:f4", "network": {"id": "9de26342-0f6c-4d7d-96a5-d4ad35573211", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1378273293-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbd2a72dddad4f2892243a33df4fa2d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape068d7e5-dc", "ovs_interfaceid": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Start _get_guest_xml network_info=[{"id": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "address": "fa:16:3e:15:a2:f4", "network": {"id": "9de26342-0f6c-4d7d-96a5-d4ad35573211", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1378273293-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbd2a72dddad4f2892243a33df4fa2d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape068d7e5-dc", "ovs_interfaceid": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 
'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': '4ac69ea5-e5d7-40c8-864e-0a164d78a727'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:05:56 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:05:56 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 
tempest-ServerActionsTestJSON-893965653-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:05:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-756269820',display_name='tempest-ServerActionsTestJSON-server-756269820',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-756269820',id=15,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMjwUpDWFcob5xB4VqDuXPX9FXT3Oo4If754w5lrosRMsv11HN44JSOF4mrro0tvAJdzBl68kfqgDpMmfJchN9rJpHKumya051JNHX7iD1cSwO0dYRTlSqqNhb1fgqIedQ==',key_name='tempest-keypair-1949594234',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fbd2a72dddad4f2892243a33df4fa2d1',ramdisk_id='',reservation_id='r-wsyefwrb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-893965653',owner_user_name='tempest-ServerActionsTestJSON-893965653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:05:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='dd6dee2194d04f45a81fd0ef45ca0632',uuid=15d42ba7-cf47-4374-83b5-06d5242951b7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "address": "fa:16:3e:15:a2:f4", "network": {"id": "9de26342-0f6c-4d7d-96a5-d4ad35573211", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1378273293-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbd2a72dddad4f2892243a33df4fa2d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape068d7e5-dc", "ovs_interfaceid": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Converting VIF {"id": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "address": "fa:16:3e:15:a2:f4", "network": {"id": "9de26342-0f6c-4d7d-96a5-d4ad35573211", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1378273293-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "fbd2a72dddad4f2892243a33df4fa2d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape068d7e5-dc", "ovs_interfaceid": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:a2:f4,bridge_name='br-int',has_traffic_filtering=True,id=e068d7e5-dc70-4b18-8dd6-5726f7a3bc84,network=Network(9de26342-0f6c-4d7d-96a5-d4ad35573211),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape068d7e5-dc') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.objects.instance [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Lazy-loading 'pci_devices' on Instance uuid 15d42ba7-cf47-4374-83b5-06d5242951b7 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] End _get_guest_xml xml= Apr 20 16:05:56 user nova-compute[71605]: 15d42ba7-cf47-4374-83b5-06d5242951b7 Apr 20 16:05:56 user nova-compute[71605]: instance-0000000f Apr 20 16:05:56 user nova-compute[71605]: 131072 Apr 20 16:05:56 user nova-compute[71605]: 1 Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: tempest-ServerActionsTestJSON-server-756269820 Apr 20 16:05:56 user nova-compute[71605]: 2023-04-20 16:05:56 Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: 128 Apr 20 16:05:56 user nova-compute[71605]: 1 Apr 20 16:05:56 user nova-compute[71605]: 0 Apr 20 16:05:56 user nova-compute[71605]: 0 Apr 20 16:05:56 user nova-compute[71605]: 1 Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: tempest-ServerActionsTestJSON-893965653-project-member Apr 20 16:05:56 user nova-compute[71605]: tempest-ServerActionsTestJSON-893965653 Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: OpenStack Foundation Apr 20 16:05:56 user nova-compute[71605]: OpenStack Nova Apr 20 16:05:56 user nova-compute[71605]: 0.0.0 Apr 20 16:05:56 user nova-compute[71605]: 
15d42ba7-cf47-4374-83b5-06d5242951b7 Apr 20 16:05:56 user nova-compute[71605]: 15d42ba7-cf47-4374-83b5-06d5242951b7 Apr 20 16:05:56 user nova-compute[71605]: Virtual Machine Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: hvm Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Nehalem Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: /dev/urandom Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: Apr 20 16:05:56 user nova-compute[71605]: {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:05:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-756269820',display_name='tempest-ServerActionsTestJSON-server-756269820',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-756269820',id=15,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMjwUpDWFcob5xB4VqDuXPX9FXT3Oo4If754w5lrosRMsv11HN44JSOF4mrro0tvAJdzBl68kfqgDpMmfJchN9rJpHKumya051JNHX7iD1cSwO0dYRTlSqqNhb1fgqIedQ==',key_name='tempest-keypair-1949594234',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fbd2a72dddad4f2892243a33df4fa2d1',ramdisk_id='',reservation_id='r-wsyefwrb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-893965653',owner_user_name='tempest-ServerActionsTestJSON-893965653-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:05:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='dd6dee2194d04f45a81fd0ef45ca0632',uuid=15d42ba7-cf47-4374-83b5-06d5242951b7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "address": "fa:16:3e:15:a2:f4", "network": {"id": "9de26342-0f6c-4d7d-96a5-d4ad35573211", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1378273293-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbd2a72dddad4f2892243a33df4fa2d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape068d7e5-dc", "ovs_interfaceid": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Converting VIF {"id": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "address": "fa:16:3e:15:a2:f4", "network": {"id": "9de26342-0f6c-4d7d-96a5-d4ad35573211", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1378273293-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"fbd2a72dddad4f2892243a33df4fa2d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape068d7e5-dc", "ovs_interfaceid": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:a2:f4,bridge_name='br-int',has_traffic_filtering=True,id=e068d7e5-dc70-4b18-8dd6-5726f7a3bc84,network=Network(9de26342-0f6c-4d7d-96a5-d4ad35573211),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape068d7e5-dc') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG os_vif [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:a2:f4,bridge_name='br-int',has_traffic_filtering=True,id=e068d7e5-dc70-4b18-8dd6-5726f7a3bc84,network=Network(9de26342-0f6c-4d7d-96a5-d4ad35573211),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape068d7e5-dc') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape068d7e5-dc, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape068d7e5-dc, col_values=(('external_ids', {'iface-id': 'e068d7e5-dc70-4b18-8dd6-5726f7a3bc84', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:15:a2:f4', 'vm-uuid': '15d42ba7-cf47-4374-83b5-06d5242951b7'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on 
fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:56 user nova-compute[71605]: INFO os_vif [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:a2:f4,bridge_name='br-int',has_traffic_filtering=True,id=e068d7e5-dc70-4b18-8dd6-5726f7a3bc84,network=Network(9de26342-0f6c-4d7d-96a5-d4ad35573211),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape068d7e5-dc') Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] No BDM found with device name vda, not building metadata. {{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:05:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] No VIF found with MAC fa:16:3e:15:a2:f4, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:05:57 user nova-compute[71605]: DEBUG nova.compute.manager [req-ce4162ad-f1b4-437b-8f65-d114db990c03 req-1c26f8a9-a3c3-4a27-be68-7568729f29dc service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Received event network-changed-e068d7e5-dc70-4b18-8dd6-5726f7a3bc84 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:57 user nova-compute[71605]: DEBUG nova.compute.manager [req-ce4162ad-f1b4-437b-8f65-d114db990c03 req-1c26f8a9-a3c3-4a27-be68-7568729f29dc service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Refreshing instance network info cache due to event network-changed-e068d7e5-dc70-4b18-8dd6-5726f7a3bc84. 
{{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:05:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ce4162ad-f1b4-437b-8f65-d114db990c03 req-1c26f8a9-a3c3-4a27-be68-7568729f29dc service nova] Acquiring lock "refresh_cache-15d42ba7-cf47-4374-83b5-06d5242951b7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:05:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ce4162ad-f1b4-437b-8f65-d114db990c03 req-1c26f8a9-a3c3-4a27-be68-7568729f29dc service nova] Acquired lock "refresh_cache-15d42ba7-cf47-4374-83b5-06d5242951b7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:05:57 user nova-compute[71605]: DEBUG nova.network.neutron [req-ce4162ad-f1b4-437b-8f65-d114db990c03 req-1c26f8a9-a3c3-4a27-be68-7568729f29dc service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Refreshing network info cache for port e068d7e5-dc70-4b18-8dd6-5726f7a3bc84 {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:05:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:58 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:58 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:58 user nova-compute[71605]: DEBUG nova.compute.manager [req-2224ad67-9b34-475d-81b8-8f7ac516132f req-c9287b9e-18d6-41d3-a61c-511f590737c7 service nova] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Received event network-vif-plugged-989ee5cd-ff10-4bcc-9b11-017b23299187 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-2224ad67-9b34-475d-81b8-8f7ac516132f req-c9287b9e-18d6-41d3-a61c-511f590737c7 service nova] Acquiring lock "a145fb51-4ca5-4cc4-b8bd-cd3665bef473-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-2224ad67-9b34-475d-81b8-8f7ac516132f req-c9287b9e-18d6-41d3-a61c-511f590737c7 service nova] Lock "a145fb51-4ca5-4cc4-b8bd-cd3665bef473-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils 
[req-2224ad67-9b34-475d-81b8-8f7ac516132f req-c9287b9e-18d6-41d3-a61c-511f590737c7 service nova] Lock "a145fb51-4ca5-4cc4-b8bd-cd3665bef473-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:58 user nova-compute[71605]: DEBUG nova.compute.manager [req-2224ad67-9b34-475d-81b8-8f7ac516132f req-c9287b9e-18d6-41d3-a61c-511f590737c7 service nova] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] No waiting events found dispatching network-vif-plugged-989ee5cd-ff10-4bcc-9b11-017b23299187 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:05:58 user nova-compute[71605]: WARNING nova.compute.manager [req-2224ad67-9b34-475d-81b8-8f7ac516132f req-c9287b9e-18d6-41d3-a61c-511f590737c7 service nova] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Received unexpected event network-vif-plugged-989ee5cd-ff10-4bcc-9b11-017b23299187 for instance with vm_state building and task_state spawning. Apr 20 16:05:58 user nova-compute[71605]: DEBUG nova.network.neutron [req-ce4162ad-f1b4-437b-8f65-d114db990c03 req-1c26f8a9-a3c3-4a27-be68-7568729f29dc service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Updated VIF entry in instance network info cache for port e068d7e5-dc70-4b18-8dd6-5726f7a3bc84. {{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:05:58 user nova-compute[71605]: DEBUG nova.network.neutron [req-ce4162ad-f1b4-437b-8f65-d114db990c03 req-1c26f8a9-a3c3-4a27-be68-7568729f29dc service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Updating instance_info_cache with network_info: [{"id": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "address": "fa:16:3e:15:a2:f4", "network": {"id": "9de26342-0f6c-4d7d-96a5-d4ad35573211", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1378273293-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbd2a72dddad4f2892243a33df4fa2d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape068d7e5-dc", "ovs_interfaceid": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:05:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ce4162ad-f1b4-437b-8f65-d114db990c03 req-1c26f8a9-a3c3-4a27-be68-7568729f29dc service nova] Releasing lock "refresh_cache-15d42ba7-cf47-4374-83b5-06d5242951b7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:05:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} 
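
Note: the AddBridgeCommand / AddPortCommand / DbSetCommand transaction logged a few entries above is what os-vif drives through ovsdbapp when it plugs tape068d7e5-dc into br-int. The following is a minimal sketch of an equivalent transaction issued directly with ovsdbapp's Open vSwitch schema helper; the OVSDB socket path and timeout are assumptions about a typical DevStack host, not values taken from this log.

# Illustrative sketch only: re-issues the same kind of OVSDB transaction the
# log shows os-vif running (AddBridgeCommand, AddPortCommand, DbSetCommand).
from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

OVSDB = 'unix:/usr/local/var/run/openvswitch/db.sock'  # assumed socket path

idl = connection.OvsdbIdl.from_server(OVSDB, 'Open_vSwitch')
ovs = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=10))

with ovs.transaction(check_error=True) as txn:
    # may_exist=True keeps both commands idempotent, matching the log above
    txn.add(ovs.add_br('br-int', may_exist=True, datapath_type='system'))
    txn.add(ovs.add_port('br-int', 'tape068d7e5-dc', may_exist=True))
    # external_ids tie the OVS interface back to the Neutron port and instance
    txn.add(ovs.db_set('Interface', 'tape068d7e5-dc',
                       ('external_ids', {
                           'iface-id': 'e068d7e5-dc70-4b18-8dd6-5726f7a3bc84',
                           'iface-status': 'active',
                           'attached-mac': 'fa:16:3e:15:a2:f4',
                           'vm-uuid': '15d42ba7-cf47-4374-83b5-06d5242951b7'})))

All commands are queued on one transaction and committed when the context manager exits, which is why the log reports a single "Running txn n=1" per commit.
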
Apr 20 16:05:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:59 user nova-compute[71605]: DEBUG nova.compute.manager [req-dc70b210-db2a-4f3d-9fa9-47e550da6e0d req-86842ffd-e5ff-441b-976a-8d51e864a039 service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Received event network-vif-plugged-e068d7e5-dc70-4b18-8dd6-5726f7a3bc84 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:05:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-dc70b210-db2a-4f3d-9fa9-47e550da6e0d req-86842ffd-e5ff-441b-976a-8d51e864a039 service nova] Acquiring lock "15d42ba7-cf47-4374-83b5-06d5242951b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:05:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-dc70b210-db2a-4f3d-9fa9-47e550da6e0d req-86842ffd-e5ff-441b-976a-8d51e864a039 service nova] Lock "15d42ba7-cf47-4374-83b5-06d5242951b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:05:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-dc70b210-db2a-4f3d-9fa9-47e550da6e0d req-86842ffd-e5ff-441b-976a-8d51e864a039 service nova] Lock "15d42ba7-cf47-4374-83b5-06d5242951b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:05:59 user nova-compute[71605]: DEBUG nova.compute.manager [req-dc70b210-db2a-4f3d-9fa9-47e550da6e0d req-86842ffd-e5ff-441b-976a-8d51e864a039 service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] No waiting events found dispatching network-vif-plugged-e068d7e5-dc70-4b18-8dd6-5726f7a3bc84 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:05:59 user nova-compute[71605]: WARNING nova.compute.manager [req-dc70b210-db2a-4f3d-9fa9-47e550da6e0d req-86842ffd-e5ff-441b-976a-8d51e864a039 service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Received unexpected event network-vif-plugged-e068d7e5-dc70-4b18-8dd6-5726f7a3bc84 for instance with vm_state building and task_state spawning. 
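
Note: the "No waiting events found dispatching network-vif-plugged-..." / "Received unexpected event ..." warnings above are benign during a build: Neutron (OVN here) can emit network-vif-plugged before the compute manager has registered a waiter for it. The toy sketch below is not Nova's code; it only illustrates the register-waiter / pop-event pattern those messages describe, with every name invented for the example.

# Toy illustration of the waiter/pop pattern behind "No waiting events found"
# and "Received unexpected event": an event is only delivered to a waiter if
# one was registered before the event arrived.
import threading
from collections import defaultdict

_waiters = defaultdict(dict)   # instance_uuid -> {event_name: threading.Event}
_lock = threading.Lock()

def prepare_for_event(instance_uuid, event_name):
    """Register interest before triggering the action that causes the event."""
    ev = threading.Event()
    with _lock:
        _waiters[instance_uuid][event_name] = ev
    return ev

def pop_instance_event(instance_uuid, event_name):
    """Called when an external event arrives; None means 'unexpected event'."""
    with _lock:
        return _waiters[instance_uuid].pop(event_name, None)

def handle_external_event(instance_uuid, event_name):
    ev = pop_instance_event(instance_uuid, event_name)
    if ev is None:
        print(f'WARNING: unexpected event {event_name} for {instance_uuid}')
    else:
        ev.set()   # wake whoever is blocked in ev.wait(timeout)
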
Apr 20 16:05:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:05:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:00 user nova-compute[71605]: DEBUG nova.compute.manager [req-bbeabd21-0e47-45a4-8d5a-bb910649a6a7 req-66045cd9-4351-481d-9090-52a5228d54b4 service nova] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Received event network-vif-plugged-989ee5cd-ff10-4bcc-9b11-017b23299187 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:06:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-bbeabd21-0e47-45a4-8d5a-bb910649a6a7 req-66045cd9-4351-481d-9090-52a5228d54b4 service nova] Acquiring lock "a145fb51-4ca5-4cc4-b8bd-cd3665bef473-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:06:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-bbeabd21-0e47-45a4-8d5a-bb910649a6a7 req-66045cd9-4351-481d-9090-52a5228d54b4 service nova] Lock "a145fb51-4ca5-4cc4-b8bd-cd3665bef473-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:06:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-bbeabd21-0e47-45a4-8d5a-bb910649a6a7 req-66045cd9-4351-481d-9090-52a5228d54b4 service nova] Lock "a145fb51-4ca5-4cc4-b8bd-cd3665bef473-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:00 user nova-compute[71605]: DEBUG nova.compute.manager [req-bbeabd21-0e47-45a4-8d5a-bb910649a6a7 req-66045cd9-4351-481d-9090-52a5228d54b4 service nova] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] No waiting events found dispatching network-vif-plugged-989ee5cd-ff10-4bcc-9b11-017b23299187 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:06:00 user nova-compute[71605]: WARNING nova.compute.manager [req-bbeabd21-0e47-45a4-8d5a-bb910649a6a7 req-66045cd9-4351-481d-9090-52a5228d54b4 service nova] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Received unexpected event network-vif-plugged-989ee5cd-ff10-4bcc-9b11-017b23299187 for instance with vm_state building and task_state spawning. 
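
Note: the Acquiring / acquired / "released" lines with waited/held timings throughout this section come from oslo.concurrency's lockutils wrappers. A minimal sketch of the two forms visible in the log, the explicit lock() context manager (used around refresh_cache-<uuid>) and the synchronized decorator (used for the per-instance "-events" lock), is below; the function bodies are placeholders.

# Minimal sketch of the oslo.concurrency patterns that produce the
# Acquiring/Acquired/Releasing and "acquired by ... waited"/"held" log lines.
from oslo_concurrency import lockutils

INSTANCE = '15d42ba7-cf47-4374-83b5-06d5242951b7'

def refresh_network_cache(instance_uuid):
    # Explicit lock, as used around "refresh_cache-<uuid>" in the log.
    with lockutils.lock('refresh_cache-%s' % instance_uuid):
        pass  # placeholder: re-query Neutron and update the info cache

@lockutils.synchronized(f'{INSTANCE}-events')
def pop_event():
    # Decorator form, as used for the per-instance "-events" lock.
    pass  # placeholder: pop a waiting event, if any
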
Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:06:01 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] VM Resumed (Lifecycle Event) Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.compute.manager [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:06:01 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Instance spawned successfully. Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:06:01 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] During sync_power_state the instance has a pending task (spawning). Skip. 
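
Note: the lifecycle entries above ("VM Resumed (Lifecycle Event)", "Synchronizing instance power state ... current DB power_state: 0, VM power_state: 1") are driven by libvirt domain events; power state 1 corresponds to a running domain. Since the "End _get_guest_xml" dump earlier in this capture lost its XML markup, the sketch below shows one way to pull the domain's state and full XML straight from libvirt with libvirt-python. The qemu:///system URI is an assumption about the host; the domain name instance-0000000f comes from the log.

# Sketch: inspect the domain this log spawned and recover the guest XML that
# the captured "End _get_guest_xml" dump did not preserve.
import libvirt

conn = libvirt.open('qemu:///system')          # assumes local privileged URI
dom = conn.lookupByName('instance-0000000f')   # domain name from the log

state, reason = dom.state()                    # state 1 == VIR_DOMAIN_RUNNING
print('state:', state, 'reason:', reason)

print(dom.XMLDesc())                           # full <domain> definition

conn.close()
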
Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:06:01 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] VM Started (Lifecycle Event) Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Found default for hw_disk_bus of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Found default for hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event 
/opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.compute.manager [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:06:01 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Instance spawned successfully. Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:06:01 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:06:01 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] VM Resumed (Lifecycle Event) Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Found default for hw_disk_bus of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 
tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Found default for hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:06:01 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:06:01 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] VM Started (Lifecycle Event) Apr 20 16:06:01 user nova-compute[71605]: INFO nova.compute.manager [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Took 10.29 seconds to spawn the instance on the hypervisor. 
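
Note: the "Found default for hw_cdrom_bus of ide ... hw_vif_model of virtio" entries above record Nova registering per-instance defaults because the cirros image carries none of those properties. If you prefer the image to declare them explicitly, a hedged sketch using openstacksdk follows; it assumes a clouds.yaml entry named devstack and that the image proxy passes extra keyword arguments through as Glance image properties, which is how the SDK's image v2 resource normally behaves. The property names themselves are the ones listed in the log.

# Hedged sketch: pin the hw_* properties on the image so the driver does not
# have to fall back to per-instance defaults. Cloud name is an assumption.
import openstack

conn = openstack.connect(cloud='devstack')
image = conn.image.find_image('cirros-0.5.2-x86_64-disk')

conn.image.update_image(
    image,
    hw_disk_bus='virtio',
    hw_cdrom_bus='ide',
    hw_vif_model='virtio',
    hw_video_model='virtio',
)
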
Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.compute.manager [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:06:01 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:06:01 user nova-compute[71605]: INFO nova.compute.manager [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Took 9.09 seconds to spawn the instance on the hypervisor. Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.compute.manager [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.compute.manager [req-2310c2f7-d723-4609-88f7-5745ee7f5811 req-b6359646-6b64-4e13-834f-e283b994c993 service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Received event network-vif-plugged-e068d7e5-dc70-4b18-8dd6-5726f7a3bc84 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-2310c2f7-d723-4609-88f7-5745ee7f5811 req-b6359646-6b64-4e13-834f-e283b994c993 service nova] Acquiring lock "15d42ba7-cf47-4374-83b5-06d5242951b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-2310c2f7-d723-4609-88f7-5745ee7f5811 req-b6359646-6b64-4e13-834f-e283b994c993 service nova] Lock "15d42ba7-cf47-4374-83b5-06d5242951b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-2310c2f7-d723-4609-88f7-5745ee7f5811 req-b6359646-6b64-4e13-834f-e283b994c993 service nova] Lock "15d42ba7-cf47-4374-83b5-06d5242951b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG nova.compute.manager [req-2310c2f7-d723-4609-88f7-5745ee7f5811 req-b6359646-6b64-4e13-834f-e283b994c993 service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] No waiting events found dispatching network-vif-plugged-e068d7e5-dc70-4b18-8dd6-5726f7a3bc84 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:06:01 user nova-compute[71605]: WARNING nova.compute.manager [req-2310c2f7-d723-4609-88f7-5745ee7f5811 req-b6359646-6b64-4e13-834f-e283b994c993 service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Received unexpected event network-vif-plugged-e068d7e5-dc70-4b18-8dd6-5726f7a3bc84 for instance with vm_state building and task_state spawning. Apr 20 16:06:01 user nova-compute[71605]: INFO nova.compute.manager [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Took 11.28 seconds to build instance. Apr 20 16:06:01 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-fe38d8a9-aa47-4dc3-abb1-9ea4878bee76 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "a145fb51-4ca5-4cc4-b8bd-cd3665bef473" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 11.513s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:01 user nova-compute[71605]: INFO nova.compute.manager [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Took 10.52 seconds to build instance. 
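
Note: the "Took 9.09/10.29 seconds to spawn the instance on the hypervisor" and "Took 10.52/11.28 seconds to build instance" lines are the quickest health signal to pull out of a run like this one. Below is a small sketch that tabulates them from a saved copy of this journal output; the file name nova-compute.log and the exact regular expression are assumptions about how the log was captured, not anything defined by Nova.

# Sketch: tabulate per-instance spawn/build times from a saved copy of this
# journal output.
import re
from collections import defaultdict

PATTERN = re.compile(
    r'\[instance: (?P<uuid>[0-9a-f-]{36})\] '
    r'Took (?P<secs>[\d.]+) seconds to '
    r'(?P<what>spawn the instance on the hypervisor|build instance)')

timings = defaultdict(dict)
with open('nova-compute.log') as fh:           # assumed capture file name
    for line in fh:
        m = PATTERN.search(line)
        if m:
            timings[m.group('uuid')][m.group('what')] = float(m.group('secs'))

for uuid, t in timings.items():
    print(uuid, t)

For the two instances in this section that would yield roughly {'spawn ...': 10.29, 'build instance': 11.28} for a145fb51 and {'spawn ...': 9.09, 'build instance': 10.52} for 15d42ba7.
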
Apr 20 16:06:01 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c84a10b6-cde9-4caf-9c19-f5e4efc9fe11 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Lock "15d42ba7-cf47-4374-83b5-06d5242951b7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 10.649s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:04 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:09 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:10 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:13 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:14 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:19 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:24 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:29 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:33 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Acquiring lock 
"3ac0a246-e2fe-4164-9bc1-c96bb94e396f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:06:33 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "3ac0a246-e2fe-4164-9bc1-c96bb94e396f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:06:33 user nova-compute[71605]: DEBUG nova.compute.manager [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Starting instance... {{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:06:33 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:06:33 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:06:33 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:06:33 user nova-compute[71605]: INFO nova.compute.claims [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Claim successful on node user Apr 20 16:06:33 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:06:33 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:06:33 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.454s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:33 user nova-compute[71605]: DEBUG nova.compute.manager [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Start building networks asynchronously for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:06:33 user nova-compute[71605]: DEBUG nova.compute.manager [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Allocating IP information in the background. {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:06:33 user nova-compute[71605]: DEBUG nova.network.neutron [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:06:33 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 20 16:06:33 user nova-compute[71605]: DEBUG nova.compute.manager [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:06:33 user nova-compute[71605]: DEBUG nova.policy [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1c8f57b12bc749888ea89bdbee258811', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '77f831070f5847bda788f6f0fcfedb03', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:06:33 user nova-compute[71605]: DEBUG nova.compute.manager [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Start spawning the instance on the hypervisor. {{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:06:33 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:06:33 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Creating image(s) Apr 20 16:06:33 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Acquiring lock "/opt/stack/data/nova/instances/3ac0a246-e2fe-4164-9bc1-c96bb94e396f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:06:33 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "/opt/stack/data/nova/instances/3ac0a246-e2fe-4164-9bc1-c96bb94e396f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:06:33 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "/opt/stack/data/nova/instances/3ac0a246-e2fe-4164-9bc1-c96bb94e396f/disk.info" 
"released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:33 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.136s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.138s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 
tempest-AttachVolumeTestJSON-1838780462-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/3ac0a246-e2fe-4164-9bc1-c96bb94e396f/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/3ac0a246-e2fe-4164-9bc1-c96bb94e396f/disk 1073741824" returned: 0 in 0.047s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.191s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.138s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Checking if we can resize image /opt/stack/data/nova/instances/3ac0a246-e2fe-4164-9bc1-c96bb94e396f/disk. 
size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3ac0a246-e2fe-4164-9bc1-c96bb94e396f/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG nova.network.neutron [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Successfully created port: 64234034-3bc7-49ec-adb2-d425da7301e7 {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3ac0a246-e2fe-4164-9bc1-c96bb94e396f/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Cannot resize image /opt/stack/data/nova/instances/3ac0a246-e2fe-4164-9bc1-c96bb94e396f/disk to a smaller size. 
{{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG nova.objects.instance [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lazy-loading 'migration_context' on Instance uuid 3ac0a246-e2fe-4164-9bc1-c96bb94e396f {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Ensure instance console log exists: /opt/stack/data/nova/instances/3ac0a246-e2fe-4164-9bc1-c96bb94e396f/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:06:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Acquiring lock "dd78d74a-11d6-4f06-8092-5088b3fad412" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "dd78d74a-11d6-4f06-8092-5088b3fad412" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None 
req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Acquiring lock "dd78d74a-11d6-4f06-8092-5088b3fad412-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "dd78d74a-11d6-4f06-8092-5088b3fad412-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "dd78d74a-11d6-4f06-8092-5088b3fad412-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:35 user nova-compute[71605]: INFO nova.compute.manager [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Terminating instance Apr 20 16:06:35 user nova-compute[71605]: DEBUG nova.compute.manager [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Start destroying the instance on the hypervisor. 
{{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG nova.compute.manager [req-ebdfa73e-6cf3-4f2a-87c7-48b259c9100a req-eb7bf867-7eb5-4944-b063-09a8341e43f8 service nova] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Received event network-vif-unplugged-14dcc4ff-4a09-446a-b0ea-d9989cd3fa16 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ebdfa73e-6cf3-4f2a-87c7-48b259c9100a req-eb7bf867-7eb5-4944-b063-09a8341e43f8 service nova] Acquiring lock "dd78d74a-11d6-4f06-8092-5088b3fad412-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ebdfa73e-6cf3-4f2a-87c7-48b259c9100a req-eb7bf867-7eb5-4944-b063-09a8341e43f8 service nova] Lock "dd78d74a-11d6-4f06-8092-5088b3fad412-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ebdfa73e-6cf3-4f2a-87c7-48b259c9100a req-eb7bf867-7eb5-4944-b063-09a8341e43f8 service nova] Lock "dd78d74a-11d6-4f06-8092-5088b3fad412-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG nova.compute.manager [req-ebdfa73e-6cf3-4f2a-87c7-48b259c9100a req-eb7bf867-7eb5-4944-b063-09a8341e43f8 service nova] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] No waiting events found dispatching network-vif-unplugged-14dcc4ff-4a09-446a-b0ea-d9989cd3fa16 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG nova.compute.manager [req-ebdfa73e-6cf3-4f2a-87c7-48b259c9100a req-eb7bf867-7eb5-4944-b063-09a8341e43f8 service nova] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Received event network-vif-unplugged-14dcc4ff-4a09-446a-b0ea-d9989cd3fa16 for instance with task_state deleting. 
{{(pid=71605) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG nova.network.neutron [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Successfully updated port: 64234034-3bc7-49ec-adb2-d425da7301e7 {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG nova.compute.manager [req-b88a9552-41d8-4f79-a12e-9526b21a2e2c req-6b9a5286-9eeb-4577-aa90-e22196978386 service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Received event network-changed-64234034-3bc7-49ec-adb2-d425da7301e7 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG nova.compute.manager [req-b88a9552-41d8-4f79-a12e-9526b21a2e2c req-6b9a5286-9eeb-4577-aa90-e22196978386 service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Refreshing instance network info cache due to event network-changed-64234034-3bc7-49ec-adb2-d425da7301e7. {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b88a9552-41d8-4f79-a12e-9526b21a2e2c req-6b9a5286-9eeb-4577-aa90-e22196978386 service nova] Acquiring lock "refresh_cache-3ac0a246-e2fe-4164-9bc1-c96bb94e396f" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b88a9552-41d8-4f79-a12e-9526b21a2e2c req-6b9a5286-9eeb-4577-aa90-e22196978386 service nova] Acquired lock "refresh_cache-3ac0a246-e2fe-4164-9bc1-c96bb94e396f" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG nova.network.neutron [req-b88a9552-41d8-4f79-a12e-9526b21a2e2c req-6b9a5286-9eeb-4577-aa90-e22196978386 service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Refreshing network info cache for port 64234034-3bc7-49ec-adb2-d425da7301e7 {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Acquiring lock "refresh_cache-3ac0a246-e2fe-4164-9bc1-c96bb94e396f" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG nova.network.neutron [req-b88a9552-41d8-4f79-a12e-9526b21a2e2c req-6b9a5286-9eeb-4577-aa90-e22196978386 service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Instance cache missing network info. {{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:06:35 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Instance destroyed successfully. 
Apr 20 16:06:35 user nova-compute[71605]: DEBUG nova.objects.instance [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lazy-loading 'resources' on Instance uuid dd78d74a-11d6-4f06-8092-5088b3fad412 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:04:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1136804593',display_name='tempest-ServersNegativeTestJSON-server-1136804593',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1136804593',id=11,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-20T16:04:55Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='d8444d3c8f554a56967917670b19dc37',ramdisk_id='',reservation_id='r-osnh458p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersNegativeTestJSON-942369263',owner_user_name='tempest-ServersNegativeTestJSON-942369263-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:04:55Z,user_data=None,user_id='9be25e958c6047068ab5ce63106b0754',uuid=dd78d74a-11d6-4f06-8092-5088b3fad412,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "14dcc4ff-4a09-446a-b0ea-d9989cd3fa16", "address": "fa:16:3e:7e:10:bd", "network": {"id": "c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": 
"ovn"}}, "devname": "tap14dcc4ff-4a", "ovs_interfaceid": "14dcc4ff-4a09-446a-b0ea-d9989cd3fa16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Converting VIF {"id": "14dcc4ff-4a09-446a-b0ea-d9989cd3fa16", "address": "fa:16:3e:7e:10:bd", "network": {"id": "c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap14dcc4ff-4a", "ovs_interfaceid": "14dcc4ff-4a09-446a-b0ea-d9989cd3fa16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7e:10:bd,bridge_name='br-int',has_traffic_filtering=True,id=14dcc4ff-4a09-446a-b0ea-d9989cd3fa16,network=Network(c36830a6-66f7-4f28-8879-e228da46cead),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14dcc4ff-4a') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG os_vif [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:10:bd,bridge_name='br-int',has_traffic_filtering=True,id=14dcc4ff-4a09-446a-b0ea-d9989cd3fa16,network=Network(c36830a6-66f7-4f28-8879-e228da46cead),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14dcc4ff-4a') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap14dcc4ff-4a, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 
20 16:06:35 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:06:35 user nova-compute[71605]: INFO os_vif [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7e:10:bd,bridge_name='br-int',has_traffic_filtering=True,id=14dcc4ff-4a09-446a-b0ea-d9989cd3fa16,network=Network(c36830a6-66f7-4f28-8879-e228da46cead),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap14dcc4ff-4a') Apr 20 16:06:35 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Deleting instance files /opt/stack/data/nova/instances/dd78d74a-11d6-4f06-8092-5088b3fad412_del Apr 20 16:06:35 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Deletion of /opt/stack/data/nova/instances/dd78d74a-11d6-4f06-8092-5088b3fad412_del complete Apr 20 16:06:35 user nova-compute[71605]: INFO nova.compute.manager [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Took 0.71 seconds to destroy the instance on the hypervisor. Apr 20 16:06:35 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG nova.network.neutron [req-b88a9552-41d8-4f79-a12e-9526b21a2e2c req-6b9a5286-9eeb-4577-aa90-e22196978386 service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b88a9552-41d8-4f79-a12e-9526b21a2e2c req-6b9a5286-9eeb-4577-aa90-e22196978386 service nova] Releasing lock "refresh_cache-3ac0a246-e2fe-4164-9bc1-c96bb94e396f" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Acquired lock "refresh_cache-3ac0a246-e2fe-4164-9bc1-c96bb94e396f" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG nova.network.neutron [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:06:35 user nova-compute[71605]: DEBUG nova.network.neutron [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Instance cache missing network info. {{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:06:36 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Took 0.70 seconds to deallocate network for instance. 
Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.network.neutron [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Updating instance_info_cache with network_info: [{"id": "64234034-3bc7-49ec-adb2-d425da7301e7", "address": "fa:16:3e:4b:81:b9", "network": {"id": "27275346-fa92-4114-a62b-d59f0212eb8f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-871140467-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "77f831070f5847bda788f6f0fcfedb03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64234034-3b", "ovs_interfaceid": "64234034-3bc7-49ec-adb2-d425da7301e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Releasing lock "refresh_cache-3ac0a246-e2fe-4164-9bc1-c96bb94e396f" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.compute.manager [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Instance network_info: |[{"id": "64234034-3bc7-49ec-adb2-d425da7301e7", "address": "fa:16:3e:4b:81:b9", "network": {"id": "27275346-fa92-4114-a62b-d59f0212eb8f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-871140467-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "77f831070f5847bda788f6f0fcfedb03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64234034-3b", "ovs_interfaceid": "64234034-3bc7-49ec-adb2-d425da7301e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Start _get_guest_xml network_info=[{"id": "64234034-3bc7-49ec-adb2-d425da7301e7", "address": "fa:16:3e:4b:81:b9", "network": {"id": 
"27275346-fa92-4114-a62b-d59f0212eb8f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-871140467-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "77f831070f5847bda788f6f0fcfedb03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64234034-3b", "ovs_interfaceid": "64234034-3bc7-49ec-adb2-d425da7301e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': '4ac69ea5-e5d7-40c8-864e-0a164d78a727'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:06:36 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:06:36 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:06:36 
user nova-compute[71605]: DEBUG nova.virt.hardware [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:06:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-673629515',display_name='tempest-AttachVolumeTestJSON-server-673629515',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-673629515',id=16,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGZM/GSdMLabrovqLAcYqsj7WFJ8JEyf+MdNfn+7QjGV1E8w98tErRtmHPGjmfT7XNg40a0X/HuPTbbuPZsBHAMaW5V6k6XIxdNK2JrY++eeL0UNW7ZwAqAXZ0rf7wYalg==',key_name='tempest-keypair-398342575',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='77f831070f5847bda788f6f0fcfedb03',ramdisk_id='',reservation_id='r-fhivp1qm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-1838780462',owner_user_name='tempest-AttachVolumeTestJSON-1838780462-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:06:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1c8f57b12bc749888ea89bdbee258811',uuid=3ac0a246-e2fe-4164-9bc1-c96bb94e396f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "64234034-3bc7-49ec-adb2-d425da7301e7", "address": "fa:16:3e:4b:81:b9", "network": {"id": "27275346-fa92-4114-a62b-d59f0212eb8f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-871140467-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "77f831070f5847bda788f6f0fcfedb03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64234034-3b", "ovs_interfaceid": "64234034-3bc7-49ec-adb2-d425da7301e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Converting VIF {"id": "64234034-3bc7-49ec-adb2-d425da7301e7", "address": "fa:16:3e:4b:81:b9", "network": {"id": "27275346-fa92-4114-a62b-d59f0212eb8f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-871140467-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "77f831070f5847bda788f6f0fcfedb03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64234034-3b", "ovs_interfaceid": "64234034-3bc7-49ec-adb2-d425da7301e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4b:81:b9,bridge_name='br-int',has_traffic_filtering=True,id=64234034-3bc7-49ec-adb2-d425da7301e7,network=Network(27275346-fa92-4114-a62b-d59f0212eb8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64234034-3b') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.objects.instance [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lazy-loading 'pci_devices' on Instance uuid 3ac0a246-e2fe-4164-9bc1-c96bb94e396f {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] End _get_guest_xml xml= Apr 20 16:06:36 user nova-compute[71605]: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f Apr 20 16:06:36 user nova-compute[71605]: instance-00000010 Apr 20 16:06:36 user nova-compute[71605]: 131072 Apr 20 16:06:36 user nova-compute[71605]: 1 Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: tempest-AttachVolumeTestJSON-server-673629515 Apr 20 16:06:36 user nova-compute[71605]: 2023-04-20 16:06:36 Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: 128 Apr 20 16:06:36 user nova-compute[71605]: 1 Apr 20 16:06:36 user nova-compute[71605]: 0 Apr 20 16:06:36 user nova-compute[71605]: 0 Apr 20 16:06:36 user nova-compute[71605]: 1 Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user 
nova-compute[71605]: tempest-AttachVolumeTestJSON-1838780462-project-member Apr 20 16:06:36 user nova-compute[71605]: tempest-AttachVolumeTestJSON-1838780462 Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: OpenStack Foundation Apr 20 16:06:36 user nova-compute[71605]: OpenStack Nova Apr 20 16:06:36 user nova-compute[71605]: 0.0.0 Apr 20 16:06:36 user nova-compute[71605]: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f Apr 20 16:06:36 user nova-compute[71605]: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f Apr 20 16:06:36 user nova-compute[71605]: Virtual Machine Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: hvm Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Nehalem Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: /dev/urandom Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: Apr 20 16:06:36 user nova-compute[71605]: {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:06:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-673629515',display_name='tempest-AttachVolumeTestJSON-server-673629515',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-673629515',id=16,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGZM/GSdMLabrovqLAcYqsj7WFJ8JEyf+MdNfn+7QjGV1E8w98tErRtmHPGjmfT7XNg40a0X/HuPTbbuPZsBHAMaW5V6k6XIxdNK2JrY++eeL0UNW7ZwAqAXZ0rf7wYalg==',key_name='tempest-keypair-398342575',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='77f831070f5847bda788f6f0fcfedb03',ramdisk_id='',reservation_id='r-fhivp1qm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-1838780462',owner_user_name='tempest-AttachVolumeTestJSON-1838780462-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:06:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1c8f57b12bc749888ea89bdbee258811',uuid=3ac0a246-e2fe-4164-9bc1-c96bb94e396f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "64234034-3bc7-49ec-adb2-d425da7301e7", "address": "fa:16:3e:4b:81:b9", "network": {"id": "27275346-fa92-4114-a62b-d59f0212eb8f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-871140467-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "77f831070f5847bda788f6f0fcfedb03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64234034-3b", "ovs_interfaceid": "64234034-3bc7-49ec-adb2-d425da7301e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:06:36 user 
nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Converting VIF {"id": "64234034-3bc7-49ec-adb2-d425da7301e7", "address": "fa:16:3e:4b:81:b9", "network": {"id": "27275346-fa92-4114-a62b-d59f0212eb8f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-871140467-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "77f831070f5847bda788f6f0fcfedb03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64234034-3b", "ovs_interfaceid": "64234034-3bc7-49ec-adb2-d425da7301e7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4b:81:b9,bridge_name='br-int',has_traffic_filtering=True,id=64234034-3bc7-49ec-adb2-d425da7301e7,network=Network(27275346-fa92-4114-a62b-d59f0212eb8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64234034-3b') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG os_vif [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4b:81:b9,bridge_name='br-int',has_traffic_filtering=True,id=64234034-3bc7-49ec-adb2-d425da7301e7,network=Network(27275346-fa92-4114-a62b-d59f0212eb8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64234034-3b') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): 
AddPortCommand(_result=None, bridge=br-int, port=tap64234034-3b, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap64234034-3b, col_values=(('external_ids', {'iface-id': '64234034-3bc7-49ec-adb2-d425da7301e7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4b:81:b9', 'vm-uuid': '3ac0a246-e2fe-4164-9bc1-c96bb94e396f'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:36 user nova-compute[71605]: INFO os_vif [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4b:81:b9,bridge_name='br-int',has_traffic_filtering=True,id=64234034-3bc7-49ec-adb2-d425da7301e7,network=Network(27275346-fa92-4114-a62b-d59f0212eb8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64234034-3b') Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] No VIF found with MAC fa:16:3e:4b:81:b9, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:06:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.338s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:36 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Deleted allocations for instance dd78d74a-11d6-4f06-8092-5088b3fad412 Apr 20 16:06:37 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-002dcb00-027a-413f-942c-8a923cdef68a tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "dd78d74a-11d6-4f06-8092-5088b3fad412" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.955s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:37 user nova-compute[71605]: DEBUG nova.compute.manager [req-af574e9b-172f-4247-8fe1-5c5b7a6c610d req-9f188fad-2c00-4cd7-8a9d-a92f6c9c517a service nova] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Received event network-vif-plugged-14dcc4ff-4a09-446a-b0ea-d9989cd3fa16 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:06:37 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-af574e9b-172f-4247-8fe1-5c5b7a6c610d req-9f188fad-2c00-4cd7-8a9d-a92f6c9c517a service nova] Acquiring lock "dd78d74a-11d6-4f06-8092-5088b3fad412-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:06:37 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-af574e9b-172f-4247-8fe1-5c5b7a6c610d req-9f188fad-2c00-4cd7-8a9d-a92f6c9c517a service nova] Lock "dd78d74a-11d6-4f06-8092-5088b3fad412-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:06:37 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-af574e9b-172f-4247-8fe1-5c5b7a6c610d req-9f188fad-2c00-4cd7-8a9d-a92f6c9c517a service nova] Lock "dd78d74a-11d6-4f06-8092-5088b3fad412-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:37 user nova-compute[71605]: DEBUG nova.compute.manager [req-af574e9b-172f-4247-8fe1-5c5b7a6c610d req-9f188fad-2c00-4cd7-8a9d-a92f6c9c517a service nova] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] No waiting events found dispatching network-vif-plugged-14dcc4ff-4a09-446a-b0ea-d9989cd3fa16 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:06:37 user nova-compute[71605]: WARNING nova.compute.manager [req-af574e9b-172f-4247-8fe1-5c5b7a6c610d req-9f188fad-2c00-4cd7-8a9d-a92f6c9c517a service nova] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Received unexpected event network-vif-plugged-14dcc4ff-4a09-446a-b0ea-d9989cd3fa16 for instance with vm_state deleted and task_state None. Apr 20 16:06:37 user nova-compute[71605]: DEBUG nova.compute.manager [req-8bd8b6b6-f070-440c-905c-39183171f47c req-337af847-f5c2-4e2f-a07e-03d6374d0890 service nova] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Received event network-vif-deleted-14dcc4ff-4a09-446a-b0ea-d9989cd3fa16 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:06:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:38 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:38 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:38 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:38 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:38 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:39 user 
nova-compute[71605]: DEBUG nova.compute.manager [req-77400feb-a13c-45cb-982b-9d8130aaa1a5 req-9c3d757e-412d-4862-bdd0-8af79e635821 service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Received event network-vif-plugged-64234034-3bc7-49ec-adb2-d425da7301e7 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:06:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-77400feb-a13c-45cb-982b-9d8130aaa1a5 req-9c3d757e-412d-4862-bdd0-8af79e635821 service nova] Acquiring lock "3ac0a246-e2fe-4164-9bc1-c96bb94e396f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:06:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-77400feb-a13c-45cb-982b-9d8130aaa1a5 req-9c3d757e-412d-4862-bdd0-8af79e635821 service nova] Lock "3ac0a246-e2fe-4164-9bc1-c96bb94e396f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:06:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-77400feb-a13c-45cb-982b-9d8130aaa1a5 req-9c3d757e-412d-4862-bdd0-8af79e635821 service nova] Lock "3ac0a246-e2fe-4164-9bc1-c96bb94e396f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:39 user nova-compute[71605]: DEBUG nova.compute.manager [req-77400feb-a13c-45cb-982b-9d8130aaa1a5 req-9c3d757e-412d-4862-bdd0-8af79e635821 service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] No waiting events found dispatching network-vif-plugged-64234034-3bc7-49ec-adb2-d425da7301e7 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:06:39 user nova-compute[71605]: WARNING nova.compute.manager [req-77400feb-a13c-45cb-982b-9d8130aaa1a5 req-9c3d757e-412d-4862-bdd0-8af79e635821 service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Received unexpected event network-vif-plugged-64234034-3bc7-49ec-adb2-d425da7301e7 for instance with vm_state building and task_state spawning. 
Apr 20 16:06:39 user nova-compute[71605]: DEBUG nova.compute.manager [req-77400feb-a13c-45cb-982b-9d8130aaa1a5 req-9c3d757e-412d-4862-bdd0-8af79e635821 service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Received event network-vif-plugged-64234034-3bc7-49ec-adb2-d425da7301e7 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:06:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-77400feb-a13c-45cb-982b-9d8130aaa1a5 req-9c3d757e-412d-4862-bdd0-8af79e635821 service nova] Acquiring lock "3ac0a246-e2fe-4164-9bc1-c96bb94e396f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:06:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-77400feb-a13c-45cb-982b-9d8130aaa1a5 req-9c3d757e-412d-4862-bdd0-8af79e635821 service nova] Lock "3ac0a246-e2fe-4164-9bc1-c96bb94e396f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:06:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-77400feb-a13c-45cb-982b-9d8130aaa1a5 req-9c3d757e-412d-4862-bdd0-8af79e635821 service nova] Lock "3ac0a246-e2fe-4164-9bc1-c96bb94e396f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:39 user nova-compute[71605]: DEBUG nova.compute.manager [req-77400feb-a13c-45cb-982b-9d8130aaa1a5 req-9c3d757e-412d-4862-bdd0-8af79e635821 service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] No waiting events found dispatching network-vif-plugged-64234034-3bc7-49ec-adb2-d425da7301e7 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:06:39 user nova-compute[71605]: WARNING nova.compute.manager [req-77400feb-a13c-45cb-982b-9d8130aaa1a5 req-9c3d757e-412d-4862-bdd0-8af79e635821 service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Received unexpected event network-vif-plugged-64234034-3bc7-49ec-adb2-d425da7301e7 for instance with vm_state building and task_state spawning. 
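[Editor's sketch] The two blocks above trace Nova's external-event plumbing for network-vif-plugged: the handler takes the per-instance "-events" lock, tries to pop a registered waiter for the event, and emits the WARNING when nothing is waiting (here the build thread had not registered a waiter for this plug event). A minimal, self-contained Python sketch of that pattern, assuming a toy InstanceEvents class rather than the nova.compute.manager implementation:

import threading

class InstanceEvents:
    """Toy stand-in for the per-instance event registry seen in the log."""

    def __init__(self):
        self._lock = threading.Lock()   # plays the role of the "<uuid>-events" lock
        self._waiters = {}              # {instance_uuid: {event_name: threading.Event}}

    def prepare(self, instance_uuid, event_name):
        # Called by the thread that will later block waiting for the event.
        with self._lock:
            waiter = threading.Event()
            self._waiters.setdefault(instance_uuid, {})[event_name] = waiter
        return waiter

    def pop_instance_event(self, instance_uuid, event_name):
        # Called when an external event arrives from Neutron.
        with self._lock:
            waiter = self._waiters.get(instance_uuid, {}).pop(event_name, None)
        if waiter is None:
            print("No waiting events found dispatching %s" % event_name)
            return False
        waiter.set()                    # wakes the blocked builder thread
        return True

events = InstanceEvents()
# No waiter was registered, so this mirrors the "unexpected event" warning above.
events.pop_instance_event("3ac0a246-e2fe-4164-9bc1-c96bb94e396f",
                          "network-vif-plugged-64234034-3bc7-49ec-adb2-d425da7301e7")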
Apr 20 16:06:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Acquiring lock "91f4b3d1-0fea-4378-94e3-c2bbfd8cad81" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:06:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Lock "91f4b3d1-0fea-4378-94e3-c2bbfd8cad81" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:06:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Acquiring lock "91f4b3d1-0fea-4378-94e3-c2bbfd8cad81-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:06:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Lock "91f4b3d1-0fea-4378-94e3-c2bbfd8cad81-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:06:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Lock "91f4b3d1-0fea-4378-94e3-c2bbfd8cad81-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:39 user nova-compute[71605]: INFO nova.compute.manager [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Terminating instance Apr 20 16:06:39 user nova-compute[71605]: DEBUG nova.compute.manager [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Start destroying the instance on the hypervisor. 
{{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:06:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:06:40 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] VM Resumed (Lifecycle Event) Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.compute.manager [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:06:40 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Instance spawned successfully. 
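[Editor's sketch] The "Successfully plugged vif" path above is driven by the ovsdbapp transaction logged a few entries earlier (AddBridgeCommand, AddPortCommand, DbSetCommand): the tap device is added to br-int and its Interface row is stamped with the external_ids that let OVN bind the port to this instance. A rough sketch of the same transaction issued directly through ovsdbapp's Open_vSwitch helper follows; the database socket path is an assumption and this is not the exact os-vif code path.

from ovsdbapp.backend.ovs_idl import connection
from ovsdbapp.schema.open_vswitch import impl_idl

# Assumed local OVSDB socket; adjust to the deployment's actual path.
OVSDB_CONNECTION = 'unix:/usr/local/var/run/openvswitch/db.sock'

idl = connection.OvsdbIdl.from_server(OVSDB_CONNECTION, 'Open_vSwitch')
ovs = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

external_ids = {
    'iface-id': '64234034-3bc7-49ec-adb2-d425da7301e7',   # Neutron port UUID
    'iface-status': 'active',
    'attached-mac': 'fa:16:3e:4b:81:b9',
    'vm-uuid': '3ac0a246-e2fe-4164-9bc1-c96bb94e396f',
}

# One transaction, matching the txn n=1 command entries in the log.
with ovs.transaction(check_error=True) as txn:
    txn.add(ovs.add_br('br-int', may_exist=True, datapath_type='system'))
    txn.add(ovs.add_port('br-int', 'tap64234034-3b', may_exist=True))
    txn.add(ovs.db_set('Interface', 'tap64234034-3b', ('external_ids', external_ids)))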
Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Found default for hw_disk_bus of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Found default for 
hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:06:40 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:06:40 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] VM Started (Lifecycle Event) Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:06:40 user nova-compute[71605]: INFO nova.compute.manager [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Took 6.32 seconds to spawn the instance on the hypervisor. 
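[Editor's sketch] The "Synchronizing instance power state ... Skip." entries above show why the Resumed and Started lifecycle events do not change the instance record here: while a task_state such as spawning is set, the sync defers to the thread that owns the task. A toy sketch of that guard, illustrative only and not the nova.compute.manager code:

# power_state values as Nova encodes them: 0 = NOSTATE, 1 = RUNNING.
def sync_power_state(instance, hypervisor_power_state):
    if instance.get('task_state') is not None:
        # The build/resize/delete flow owns the transition; just log and skip.
        print("During sync_power_state the instance has a pending task "
              "(%s). Skip." % instance['task_state'])
        return instance['power_state']
    instance['power_state'] = hypervisor_power_state
    return hypervisor_power_state

building = {'task_state': 'spawning', 'power_state': 0}
sync_power_state(building, 1)   # mirrors the logged case: DB says 0, libvirt reports 1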
Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.compute.manager [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:06:40 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:06:40 user nova-compute[71605]: INFO nova.compute.manager [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Took 7.12 seconds to build instance. Apr 20 16:06:40 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-efb822b5-9e82-4796-b1ec-680a0a3ad818 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "3ac0a246-e2fe-4164-9bc1-c96bb94e396f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.215s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:40 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Instance destroyed successfully. 
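[Editor's sketch] The recurring Acquiring/acquired/released triplets with their "waited" and "held" timings (for example "compute_resources" held 0.000s above) are oslo.concurrency's lock instrumentation. A minimal usage sketch, assuming the plain process-local synchronized decorator rather than Nova's exact wrapper:

from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def clean_compute_node_cache():
    # Runs only while the "compute_resources" lock is held; lockutils emits the
    # Acquiring / acquired (:: waited Ns) / released (:: held Ns) DEBUG lines.
    pass

clean_compute_node_cache()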
Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.objects.instance [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Lazy-loading 'resources' on Instance uuid 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:02:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-942605486',display_name='tempest-ServerStableDeviceRescueTest-server-942605486',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-942605486',id=4,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCZym0vodVYP/JGy/H71EPtkiLL3pgSyqc+6Le0y9dituQzc/wfdGdwVdf4pgjAAE55MUTGyqTl0C2t2y934ULtfkrcvhTphaGXzfELzex4GVcPZlULQOFRqsadQFb89Hw==',key_name='tempest-keypair-58929071',keypairs=,launch_index=0,launched_at=2023-04-20T16:03:11Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='fbbcfeb5266f4ca6b9738b18ba7d127e',ramdisk_id='',reservation_id='r-zbg0o0ba',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerStableDeviceRescueTest-179851846',owner_user_name='tempest-ServerStableDeviceRescueTest-179851846-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:04:52Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='48eeb9edc18f48f0ad13c819cdac9106',uuid=91f4b3d1-0fea-4378-94e3-c2bbfd8cad81,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "address": "fa:16:3e:d0:3f:7b", "network": {"id": "224391e3-9d6f-4e5f-b1bb-00dd1cd0ea06", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1568684394-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.131", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbbcfeb5266f4ca6b9738b18ba7d127e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2af67f0-07", "ovs_interfaceid": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Converting VIF {"id": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "address": "fa:16:3e:d0:3f:7b", "network": {"id": "224391e3-9d6f-4e5f-b1bb-00dd1cd0ea06", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1568684394-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.131", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbbcfeb5266f4ca6b9738b18ba7d127e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2af67f0-07", "ovs_interfaceid": "b2af67f0-0768-4ebc-a21b-0ef6e2b3f264", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d0:3f:7b,bridge_name='br-int',has_traffic_filtering=True,id=b2af67f0-0768-4ebc-a21b-0ef6e2b3f264,network=Network(224391e3-9d6f-4e5f-b1bb-00dd1cd0ea06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2af67f0-07') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG os_vif [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d0:3f:7b,bridge_name='br-int',has_traffic_filtering=True,id=b2af67f0-0768-4ebc-a21b-0ef6e2b3f264,network=Network(224391e3-9d6f-4e5f-b1bb-00dd1cd0ea06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2af67f0-07') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2af67f0-07, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:06:40 user nova-compute[71605]: INFO os_vif [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d0:3f:7b,bridge_name='br-int',has_traffic_filtering=True,id=b2af67f0-0768-4ebc-a21b-0ef6e2b3f264,network=Network(224391e3-9d6f-4e5f-b1bb-00dd1cd0ea06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2af67f0-07') Apr 20 16:06:40 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Deleting instance files /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81_del Apr 20 16:06:40 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Deletion of /opt/stack/data/nova/instances/91f4b3d1-0fea-4378-94e3-c2bbfd8cad81_del complete Apr 20 16:06:40 user nova-compute[71605]: INFO nova.compute.manager [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Took 0.69 seconds to destroy the instance on the hypervisor. Apr 20 16:06:40 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:40 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c2b84ca2-f67b-4219-b7e6-18d2029e998a/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c2b84ca2-f67b-4219-b7e6-18d2029e998a/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] 
Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c2b84ca2-f67b-4219-b7e6-18d2029e998a/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c2b84ca2-f67b-4219-b7e6-18d2029e998a/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk.rescue --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk.rescue --force-share --output=json" returned: 0 in 0.149s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk.rescue --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG nova.compute.manager [req-000593b0-9600-423d-956b-b3e47c2309d4 req-664e3ab0-2a8b-4d71-bdc5-d327ab95148f service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Received event network-vif-unplugged-b2af67f0-0768-4ebc-a21b-0ef6e2b3f264 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-000593b0-9600-423d-956b-b3e47c2309d4 req-664e3ab0-2a8b-4d71-bdc5-d327ab95148f service nova] Acquiring lock "91f4b3d1-0fea-4378-94e3-c2bbfd8cad81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-000593b0-9600-423d-956b-b3e47c2309d4 req-664e3ab0-2a8b-4d71-bdc5-d327ab95148f service nova] Lock "91f4b3d1-0fea-4378-94e3-c2bbfd8cad81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils 
[req-000593b0-9600-423d-956b-b3e47c2309d4 req-664e3ab0-2a8b-4d71-bdc5-d327ab95148f service nova] Lock "91f4b3d1-0fea-4378-94e3-c2bbfd8cad81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG nova.compute.manager [req-000593b0-9600-423d-956b-b3e47c2309d4 req-664e3ab0-2a8b-4d71-bdc5-d327ab95148f service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] No waiting events found dispatching network-vif-unplugged-b2af67f0-0768-4ebc-a21b-0ef6e2b3f264 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG nova.compute.manager [req-000593b0-9600-423d-956b-b3e47c2309d4 req-664e3ab0-2a8b-4d71-bdc5-d327ab95148f service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Received event network-vif-unplugged-b2af67f0-0768-4ebc-a21b-0ef6e2b3f264 for instance with task_state deleting. {{(pid=71605) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG nova.compute.manager [req-000593b0-9600-423d-956b-b3e47c2309d4 req-664e3ab0-2a8b-4d71-bdc5-d327ab95148f service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Received event network-vif-plugged-b2af67f0-0768-4ebc-a21b-0ef6e2b3f264 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-000593b0-9600-423d-956b-b3e47c2309d4 req-664e3ab0-2a8b-4d71-bdc5-d327ab95148f service nova] Acquiring lock "91f4b3d1-0fea-4378-94e3-c2bbfd8cad81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-000593b0-9600-423d-956b-b3e47c2309d4 req-664e3ab0-2a8b-4d71-bdc5-d327ab95148f service nova] Lock "91f4b3d1-0fea-4378-94e3-c2bbfd8cad81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-000593b0-9600-423d-956b-b3e47c2309d4 req-664e3ab0-2a8b-4d71-bdc5-d327ab95148f service nova] Lock "91f4b3d1-0fea-4378-94e3-c2bbfd8cad81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG nova.compute.manager [req-000593b0-9600-423d-956b-b3e47c2309d4 req-664e3ab0-2a8b-4d71-bdc5-d327ab95148f service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] No waiting events found dispatching network-vif-plugged-b2af67f0-0768-4ebc-a21b-0ef6e2b3f264 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:06:41 user nova-compute[71605]: WARNING nova.compute.manager [req-000593b0-9600-423d-956b-b3e47c2309d4 req-664e3ab0-2a8b-4d71-bdc5-d327ab95148f service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Received unexpected event network-vif-plugged-b2af67f0-0768-4ebc-a21b-0ef6e2b3f264 for instance with vm_state active and task_state deleting. 
Apr 20 16:06:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk.rescue --force-share --output=json" returned: 0 in 0.170s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:06:41 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Took 1.29 seconds to deallocate network for instance. 
Apr 20 16:06:41 user nova-compute[71605]: DEBUG nova.compute.manager [req-c8923aa7-a477-4917-9b0b-3d04070ec9a2 req-151d7144-a237-409f-9d9d-f703d8782726 service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Received event network-vif-deleted-b2af67f0-0768-4ebc-a21b-0ef6e2b3f264 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:06:41 user nova-compute[71605]: INFO nova.compute.manager [req-c8923aa7-a477-4917-9b0b-3d04070ec9a2 req-151d7144-a237-409f-9d9d-f703d8782726 service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Neutron deleted interface b2af67f0-0768-4ebc-a21b-0ef6e2b3f264; detaching it from the instance and deleting it from the info cache Apr 20 16:06:41 user nova-compute[71605]: DEBUG nova.network.neutron [req-c8923aa7-a477-4917-9b0b-3d04070ec9a2 req-151d7144-a237-409f-9d9d-f703d8782726 service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk --force-share --output=json" returned: 0 in 0.171s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a145fb51-4ca5-4cc4-b8bd-cd3665bef473/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.003s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:06:41 user nova-compute[71605]: DEBUG nova.compute.manager [req-c8923aa7-a477-4917-9b0b-3d04070ec9a2 req-151d7144-a237-409f-9d9d-f703d8782726 service nova] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Detach interface failed, port_id=b2af67f0-0768-4ebc-a21b-0ef6e2b3f264, reason: Instance 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81 could not be found. 
{{(pid=71605) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 20 16:06:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a145fb51-4ca5-4cc4-b8bd-cd3665bef473/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a145fb51-4ca5-4cc4-b8bd-cd3665bef473/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a145fb51-4ca5-4cc4-b8bd-cd3665bef473/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:42 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:06:42 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:06:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.393s {{(pid=71605) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:42 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Deleted allocations for instance 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81 Apr 20 16:06:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bbaf9613-e938-40f8-933e-790c0a707355 tempest-ServerStableDeviceRescueTest-179851846 tempest-ServerStableDeviceRescueTest-179851846-project-member] Lock "91f4b3d1-0fea-4378-94e3-c2bbfd8cad81" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.609s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json" returned: 0 in 0.163s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk --force-share --output=json" returned: 0 in 0.152s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk --force-share --output=json" returned: 0 in 0.154s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3ac0a246-e2fe-4164-9bc1-c96bb94e396f/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3ac0a246-e2fe-4164-9bc1-c96bb94e396f/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3ac0a246-e2fe-4164-9bc1-c96bb94e396f/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3ac0a246-e2fe-4164-9bc1-c96bb94e396f/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:43 user 
nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:43 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Error from libvirt while getting description of instance-00000004: [Error Code 42] Domain not found: no domain with matching uuid '91f4b3d1-0fea-4378-94e3-c2bbfd8cad81' (instance-00000004): libvirt.libvirtError: Domain not found: no domain with matching uuid '91f4b3d1-0fea-4378-94e3-c2bbfd8cad81' (instance-00000004) Apr 20 16:06:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a760987f-1a65-4e42-8cef-73db9ef2db48/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a760987f-1a65-4e42-8cef-73db9ef2db48/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a760987f-1a65-4e42-8cef-73db9ef2db48/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:06:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a760987f-1a65-4e42-8cef-73db9ef2db48/disk --force-share --output=json" returned: 0 in 0.165s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:06:44 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 16:06:44 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:06:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=7915MB free_disk=26.252094268798828GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": 
null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", 
"address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:06:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:06:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:06:44 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance d4ea4d29-b178-4da2-b971-76f97031b244 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:06:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance e8f62d46-e2dc-4870-adf1-f62d88bb653b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:06:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance fe0bde76-a4f8-4865-91af-2bd3790587a7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:06:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance dc918ed4-8bc6-4a4f-a189-d6cdd5817854 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:06:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance c2b84ca2-f67b-4219-b7e6-18d2029e998a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:06:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance a760987f-1a65-4e42-8cef-73db9ef2db48 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:06:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance a145fb51-4ca5-4cc4-b8bd-cd3665bef473 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:06:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 15d42ba7-cf47-4374-83b5-06d5242951b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:06:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 3ac0a246-e2fe-4164-9bc1-c96bb94e396f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:06:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 9 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:06:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=1664MB phys_disk=40GB used_disk=9GB total_vcpus=12 used_vcpus=9 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:06:44 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:06:44 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:06:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:06:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.517s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:06:45 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:06:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:06:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:06:45 user nova-compute[71605]: DEBUG 
nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:06:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Didn't find any instances for network info cache update. {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 20 16:06:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:06:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:06:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:06:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:06:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:06:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:06:50 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:06:50 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:06:50 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] VM Stopped (Lifecycle Event) Apr 20 16:06:50 user nova-compute[71605]: DEBUG nova.compute.manager [None req-a1a759d1-40c6-4f33-8af1-d757851d8b9b None None] [instance: dd78d74a-11d6-4f06-8092-5088b3fad412] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:06:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:55 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:06:55 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] VM Stopped (Lifecycle Event) Apr 20 16:06:55 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f187f242-06e0-420c-ab89-45dc47006de3 None None] [instance: 91f4b3d1-0fea-4378-94e3-c2bbfd8cad81] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:06:55 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:06:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:00 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:05 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:07:09 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:10 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:14 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:15 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:20 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:25 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:29 user nova-compute[71605]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:30 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:30 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:32 user nova-compute[71605]: DEBUG nova.compute.manager [req-5ae99e58-4429-4703-822e-3fd559188962 req-b9937401-bdc1-4695-b021-c6b70f63b5e6 service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Received event network-changed-9e814c79-86f6-46ce-9473-d87fb7e67641 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:07:32 user nova-compute[71605]: DEBUG nova.compute.manager [req-5ae99e58-4429-4703-822e-3fd559188962 req-b9937401-bdc1-4695-b021-c6b70f63b5e6 service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Refreshing instance network info cache due to event network-changed-9e814c79-86f6-46ce-9473-d87fb7e67641. {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:07:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-5ae99e58-4429-4703-822e-3fd559188962 req-b9937401-bdc1-4695-b021-c6b70f63b5e6 service nova] Acquiring lock "refresh_cache-c2b84ca2-f67b-4219-b7e6-18d2029e998a" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:07:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-5ae99e58-4429-4703-822e-3fd559188962 req-b9937401-bdc1-4695-b021-c6b70f63b5e6 service nova] Acquired lock "refresh_cache-c2b84ca2-f67b-4219-b7e6-18d2029e998a" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:07:32 user nova-compute[71605]: DEBUG nova.network.neutron [req-5ae99e58-4429-4703-822e-3fd559188962 req-b9937401-bdc1-4695-b021-c6b70f63b5e6 service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Refreshing network info cache for port 9e814c79-86f6-46ce-9473-d87fb7e67641 {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:07:33 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:33 user nova-compute[71605]: DEBUG nova.network.neutron [req-5ae99e58-4429-4703-822e-3fd559188962 req-b9937401-bdc1-4695-b021-c6b70f63b5e6 service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Updated VIF entry in instance network info cache for port 9e814c79-86f6-46ce-9473-d87fb7e67641. 
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:07:33 user nova-compute[71605]: DEBUG nova.network.neutron [req-5ae99e58-4429-4703-822e-3fd559188962 req-b9937401-bdc1-4695-b021-c6b70f63b5e6 service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Updating instance_info_cache with network_info: [{"id": "9e814c79-86f6-46ce-9473-d87fb7e67641", "address": "fa:16:3e:24:4e:12", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.79", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e814c79-86", "ovs_interfaceid": "9e814c79-86f6-46ce-9473-d87fb7e67641", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:07:33 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-5ae99e58-4429-4703-822e-3fd559188962 req-b9937401-bdc1-4695-b021-c6b70f63b5e6 service nova] Releasing lock "refresh_cache-c2b84ca2-f67b-4219-b7e6-18d2029e998a" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:07:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "c2b84ca2-f67b-4219-b7e6-18d2029e998a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "c2b84ca2-f67b-4219-b7e6-18d2029e998a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "c2b84ca2-f67b-4219-b7e6-18d2029e998a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-86905471-bdaa-483d-bf8a-62aba094cbca 
tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "c2b84ca2-f67b-4219-b7e6-18d2029e998a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "c2b84ca2-f67b-4219-b7e6-18d2029e998a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:34 user nova-compute[71605]: INFO nova.compute.manager [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Terminating instance Apr 20 16:07:34 user nova-compute[71605]: DEBUG nova.compute.manager [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Start destroying the instance on the hypervisor. {{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:07:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:34 user nova-compute[71605]: DEBUG nova.compute.manager [req-988efc28-1879-4161-8fff-25a792c55ed0 req-31acc249-6a46-4177-8e57-e9f573db05ee service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Received event network-vif-unplugged-9e814c79-86f6-46ce-9473-d87fb7e67641 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:07:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-988efc28-1879-4161-8fff-25a792c55ed0 req-31acc249-6a46-4177-8e57-e9f573db05ee service nova] Acquiring lock "c2b84ca2-f67b-4219-b7e6-18d2029e998a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-988efc28-1879-4161-8fff-25a792c55ed0 req-31acc249-6a46-4177-8e57-e9f573db05ee service nova] Lock "c2b84ca2-f67b-4219-b7e6-18d2029e998a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s 
{{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-988efc28-1879-4161-8fff-25a792c55ed0 req-31acc249-6a46-4177-8e57-e9f573db05ee service nova] Lock "c2b84ca2-f67b-4219-b7e6-18d2029e998a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:34 user nova-compute[71605]: DEBUG nova.compute.manager [req-988efc28-1879-4161-8fff-25a792c55ed0 req-31acc249-6a46-4177-8e57-e9f573db05ee service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] No waiting events found dispatching network-vif-unplugged-9e814c79-86f6-46ce-9473-d87fb7e67641 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:07:34 user nova-compute[71605]: DEBUG nova.compute.manager [req-988efc28-1879-4161-8fff-25a792c55ed0 req-31acc249-6a46-4177-8e57-e9f573db05ee service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Received event network-vif-unplugged-9e814c79-86f6-46ce-9473-d87fb7e67641 for instance with task_state deleting. {{(pid=71605) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 16:07:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:35 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Instance destroyed successfully. 
Apr 20 16:07:35 user nova-compute[71605]: DEBUG nova.objects.instance [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lazy-loading 'resources' on Instance uuid c2b84ca2-f67b-4219-b7e6-18d2029e998a {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:07:35 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:05:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1498143817',display_name='tempest-AttachVolumeNegativeTest-server-1498143817',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1498143817',id=12,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM/e6+xMI0YH6Nw89h/OWSpnukYpq8WmC7TXM/G8CwHs84ixak8UdfgaeBRkeLKS6hBuTod5w5YIWjrhnSQwR7L2FaQ72Z5mCu+hRUU2g4pFa5raukmqUiXrVuyvOpMkNQ==',key_name='tempest-keypair-1502137074',keypairs=,launch_index=0,launched_at=2023-04-20T16:05:54Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='71cf2664111f45788d24092e8ceede9c',ramdisk_id='',reservation_id='r-ev0yb61c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-308436039',owner_user_name='tempest-AttachVolumeNegativeTest-308436039-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:05:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='690c49feae904687826fb959ba5ba283',uuid=c2b84ca2-f67b-4219-b7e6-18d2029e998a,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9e814c79-86f6-46ce-9473-d87fb7e67641", "address": "fa:16:3e:24:4e:12", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.79", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e814c79-86", "ovs_interfaceid": "9e814c79-86f6-46ce-9473-d87fb7e67641", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:07:35 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Converting VIF {"id": "9e814c79-86f6-46ce-9473-d87fb7e67641", "address": "fa:16:3e:24:4e:12", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.79", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e814c79-86", "ovs_interfaceid": "9e814c79-86f6-46ce-9473-d87fb7e67641", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:07:35 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:24:4e:12,bridge_name='br-int',has_traffic_filtering=True,id=9e814c79-86f6-46ce-9473-d87fb7e67641,network=Network(2dc9b3da-0124-4718-9f70-a131cd030480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e814c79-86') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:07:35 user nova-compute[71605]: DEBUG os_vif [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:4e:12,bridge_name='br-int',has_traffic_filtering=True,id=9e814c79-86f6-46ce-9473-d87fb7e67641,network=Network(2dc9b3da-0124-4718-9f70-a131cd030480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e814c79-86') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:07:35 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:35 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9e814c79-86, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:07:35 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:35 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:07:35 user nova-compute[71605]: INFO os_vif [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:4e:12,bridge_name='br-int',has_traffic_filtering=True,id=9e814c79-86f6-46ce-9473-d87fb7e67641,network=Network(2dc9b3da-0124-4718-9f70-a131cd030480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e814c79-86') Apr 20 16:07:35 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Deleting instance files /opt/stack/data/nova/instances/c2b84ca2-f67b-4219-b7e6-18d2029e998a_del Apr 20 16:07:35 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Deletion of /opt/stack/data/nova/instances/c2b84ca2-f67b-4219-b7e6-18d2029e998a_del complete Apr 20 16:07:35 user nova-compute[71605]: DEBUG nova.compute.manager [req-c50b329e-0855-445d-83fb-5459e5dd58b4 req-d1ab1c53-96f6-44d2-8883-156df888f0eb service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Received event network-changed-cac4dfaa-510a-4330-b9b1-aeb25f57abef {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:07:35 user nova-compute[71605]: DEBUG nova.compute.manager [req-c50b329e-0855-445d-83fb-5459e5dd58b4 req-d1ab1c53-96f6-44d2-8883-156df888f0eb service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Refreshing instance network info cache due to event network-changed-cac4dfaa-510a-4330-b9b1-aeb25f57abef. 
{{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:07:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-c50b329e-0855-445d-83fb-5459e5dd58b4 req-d1ab1c53-96f6-44d2-8883-156df888f0eb service nova] Acquiring lock "refresh_cache-a760987f-1a65-4e42-8cef-73db9ef2db48" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:07:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-c50b329e-0855-445d-83fb-5459e5dd58b4 req-d1ab1c53-96f6-44d2-8883-156df888f0eb service nova] Acquired lock "refresh_cache-a760987f-1a65-4e42-8cef-73db9ef2db48" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:07:35 user nova-compute[71605]: DEBUG nova.network.neutron [req-c50b329e-0855-445d-83fb-5459e5dd58b4 req-d1ab1c53-96f6-44d2-8883-156df888f0eb service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Refreshing network info cache for port cac4dfaa-510a-4330-b9b1-aeb25f57abef {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:07:35 user nova-compute[71605]: INFO nova.compute.manager [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Took 0.69 seconds to destroy the instance on the hypervisor. Apr 20 16:07:35 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:07:35 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:07:35 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:07:36 user nova-compute[71605]: DEBUG nova.network.neutron [req-c50b329e-0855-445d-83fb-5459e5dd58b4 req-d1ab1c53-96f6-44d2-8883-156df888f0eb service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Updated VIF entry in instance network info cache for port cac4dfaa-510a-4330-b9b1-aeb25f57abef. 
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:07:36 user nova-compute[71605]: DEBUG nova.network.neutron [req-c50b329e-0855-445d-83fb-5459e5dd58b4 req-d1ab1c53-96f6-44d2-8883-156df888f0eb service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Updating instance_info_cache with network_info: [{"id": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "address": "fa:16:3e:97:a1:b9", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.170", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac4dfaa-51", "ovs_interfaceid": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:07:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-c50b329e-0855-445d-83fb-5459e5dd58b4 req-d1ab1c53-96f6-44d2-8883-156df888f0eb service nova] Releasing lock "refresh_cache-a760987f-1a65-4e42-8cef-73db9ef2db48" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:07:36 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:07:36 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Took 1.49 seconds to deallocate network for instance. 
Apr 20 16:07:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Acquiring lock "a760987f-1a65-4e42-8cef-73db9ef2db48" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "a760987f-1a65-4e42-8cef-73db9ef2db48" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Acquiring lock "a760987f-1a65-4e42-8cef-73db9ef2db48-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "a760987f-1a65-4e42-8cef-73db9ef2db48-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "a760987f-1a65-4e42-8cef-73db9ef2db48-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:36 user nova-compute[71605]: INFO nova.compute.manager [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Terminating instance Apr 20 16:07:36 user 
nova-compute[71605]: DEBUG nova.compute.manager [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Start destroying the instance on the hypervisor. {{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:07:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:36 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:07:36 user nova-compute[71605]: DEBUG nova.compute.manager [req-92d782e9-21ff-4ad5-802d-8ba4c93e26cd req-cbbba103-61c1-4a9f-828a-a9873909bca2 service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Received event network-vif-plugged-9e814c79-86f6-46ce-9473-d87fb7e67641 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:07:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-92d782e9-21ff-4ad5-802d-8ba4c93e26cd req-cbbba103-61c1-4a9f-828a-a9873909bca2 service nova] Acquiring lock "c2b84ca2-f67b-4219-b7e6-18d2029e998a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-92d782e9-21ff-4ad5-802d-8ba4c93e26cd req-cbbba103-61c1-4a9f-828a-a9873909bca2 service nova] Lock "c2b84ca2-f67b-4219-b7e6-18d2029e998a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-92d782e9-21ff-4ad5-802d-8ba4c93e26cd req-cbbba103-61c1-4a9f-828a-a9873909bca2 service nova] Lock "c2b84ca2-f67b-4219-b7e6-18d2029e998a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:36 user nova-compute[71605]: DEBUG nova.compute.manager [req-92d782e9-21ff-4ad5-802d-8ba4c93e26cd req-cbbba103-61c1-4a9f-828a-a9873909bca2 service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] No waiting events found dispatching network-vif-plugged-9e814c79-86f6-46ce-9473-d87fb7e67641 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:07:36 user nova-compute[71605]: WARNING nova.compute.manager [req-92d782e9-21ff-4ad5-802d-8ba4c93e26cd req-cbbba103-61c1-4a9f-828a-a9873909bca2 service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Received unexpected event network-vif-plugged-9e814c79-86f6-46ce-9473-d87fb7e67641 for instance with vm_state deleted and task_state None. 
Apr 20 16:07:36 user nova-compute[71605]: DEBUG nova.compute.manager [req-92d782e9-21ff-4ad5-802d-8ba4c93e26cd req-cbbba103-61c1-4a9f-828a-a9873909bca2 service nova] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Received event network-vif-deleted-9e814c79-86f6-46ce-9473-d87fb7e67641 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:07:36 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:07:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:37 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.324s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:37 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Deleted allocations for instance c2b84ca2-f67b-4219-b7e6-18d2029e998a Apr 20 16:07:37 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-86905471-bdaa-483d-bf8a-62aba094cbca tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "c2b84ca2-f67b-4219-b7e6-18d2029e998a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.722s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:37 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Instance destroyed successfully. 
Apr 20 16:07:37 user nova-compute[71605]: DEBUG nova.objects.instance [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lazy-loading 'resources' on Instance uuid a760987f-1a65-4e42-8cef-73db9ef2db48 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:07:37 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:05:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-924389841',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-924389841',id=13,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDVMEOOD2DPBjhKcYiA5lmZjVYxh9PWLGO75MzhXO3aLsn0kBvkh5hqWzAscvsUYLQELbD8L/orvsrJrdTwkd7/EBmpsdlVzjqkj4vcLr/kYQYhKCohu26BkQL4kIIGz1A==',key_name='tempest-keypair-297016939',keypairs=,launch_index=0,launched_at=2023-04-20T16:05:56Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='cb0a5eb3796a4d3a871843f409c6ffbd',ramdisk_id='',reservation_id='r-xxotsh8t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-1118127371',owner_user_name='tempest-AttachVolumeShelveTestJSON-1118127371-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:05:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='f50dbce30f294bb0ba6bc2811025835d',uuid=a760987f-1a65-4e42-8cef-73db9ef2db48,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "address": "fa:16:3e:97:a1:b9", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "172.24.4.170", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac4dfaa-51", "ovs_interfaceid": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:07:37 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Converting VIF {"id": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "address": "fa:16:3e:97:a1:b9", "network": {"id": "545a57d8-9d55-4ace-a0ad-635d7bc0ae52", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1085059550-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.170", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "cb0a5eb3796a4d3a871843f409c6ffbd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapcac4dfaa-51", "ovs_interfaceid": "cac4dfaa-510a-4330-b9b1-aeb25f57abef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:07:37 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:97:a1:b9,bridge_name='br-int',has_traffic_filtering=True,id=cac4dfaa-510a-4330-b9b1-aeb25f57abef,network=Network(545a57d8-9d55-4ace-a0ad-635d7bc0ae52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcac4dfaa-51') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:07:37 user nova-compute[71605]: DEBUG os_vif [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:97:a1:b9,bridge_name='br-int',has_traffic_filtering=True,id=cac4dfaa-510a-4330-b9b1-aeb25f57abef,network=Network(545a57d8-9d55-4ace-a0ad-635d7bc0ae52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcac4dfaa-51') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:07:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} 
Apr 20 16:07:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcac4dfaa-51, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:07:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:37 user nova-compute[71605]: INFO os_vif [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:97:a1:b9,bridge_name='br-int',has_traffic_filtering=True,id=cac4dfaa-510a-4330-b9b1-aeb25f57abef,network=Network(545a57d8-9d55-4ace-a0ad-635d7bc0ae52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcac4dfaa-51') Apr 20 16:07:37 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Deleting instance files /opt/stack/data/nova/instances/a760987f-1a65-4e42-8cef-73db9ef2db48_del Apr 20 16:07:37 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Deletion of /opt/stack/data/nova/instances/a760987f-1a65-4e42-8cef-73db9ef2db48_del complete Apr 20 16:07:37 user nova-compute[71605]: INFO nova.compute.manager [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Took 0.86 seconds to destroy the instance on the hypervisor. Apr 20 16:07:37 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:07:37 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:07:37 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:07:38 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:07:38 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Took 0.81 seconds to deallocate network for instance. Apr 20 16:07:38 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:38 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:38 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:07:38 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:07:38 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.260s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:38 user nova-compute[71605]: INFO 
nova.scheduler.client.report [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Deleted allocations for instance a760987f-1a65-4e42-8cef-73db9ef2db48 Apr 20 16:07:38 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c2e831f2-a9c6-43e1-a22e-5a7582a49c42 tempest-AttachVolumeShelveTestJSON-1118127371 tempest-AttachVolumeShelveTestJSON-1118127371-project-member] Lock "a760987f-1a65-4e42-8cef-73db9ef2db48" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.106s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:39 user nova-compute[71605]: DEBUG nova.compute.manager [req-65160bf0-588a-407d-ac76-974155d4a759 req-67601db3-8e54-4f13-88aa-248352f756ab service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Received event network-vif-unplugged-cac4dfaa-510a-4330-b9b1-aeb25f57abef {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:07:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-65160bf0-588a-407d-ac76-974155d4a759 req-67601db3-8e54-4f13-88aa-248352f756ab service nova] Acquiring lock "a760987f-1a65-4e42-8cef-73db9ef2db48-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-65160bf0-588a-407d-ac76-974155d4a759 req-67601db3-8e54-4f13-88aa-248352f756ab service nova] Lock "a760987f-1a65-4e42-8cef-73db9ef2db48-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-65160bf0-588a-407d-ac76-974155d4a759 req-67601db3-8e54-4f13-88aa-248352f756ab service nova] Lock "a760987f-1a65-4e42-8cef-73db9ef2db48-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:39 user nova-compute[71605]: DEBUG nova.compute.manager [req-65160bf0-588a-407d-ac76-974155d4a759 req-67601db3-8e54-4f13-88aa-248352f756ab service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] No waiting events found dispatching network-vif-unplugged-cac4dfaa-510a-4330-b9b1-aeb25f57abef {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:07:39 user nova-compute[71605]: WARNING nova.compute.manager [req-65160bf0-588a-407d-ac76-974155d4a759 req-67601db3-8e54-4f13-88aa-248352f756ab service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Received unexpected event network-vif-unplugged-cac4dfaa-510a-4330-b9b1-aeb25f57abef for instance with vm_state deleted and task_state None. 
Apr 20 16:07:39 user nova-compute[71605]: DEBUG nova.compute.manager [req-65160bf0-588a-407d-ac76-974155d4a759 req-67601db3-8e54-4f13-88aa-248352f756ab service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Received event network-vif-plugged-cac4dfaa-510a-4330-b9b1-aeb25f57abef {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:07:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-65160bf0-588a-407d-ac76-974155d4a759 req-67601db3-8e54-4f13-88aa-248352f756ab service nova] Acquiring lock "a760987f-1a65-4e42-8cef-73db9ef2db48-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-65160bf0-588a-407d-ac76-974155d4a759 req-67601db3-8e54-4f13-88aa-248352f756ab service nova] Lock "a760987f-1a65-4e42-8cef-73db9ef2db48-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:39 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-65160bf0-588a-407d-ac76-974155d4a759 req-67601db3-8e54-4f13-88aa-248352f756ab service nova] Lock "a760987f-1a65-4e42-8cef-73db9ef2db48-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:39 user nova-compute[71605]: DEBUG nova.compute.manager [req-65160bf0-588a-407d-ac76-974155d4a759 req-67601db3-8e54-4f13-88aa-248352f756ab service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] No waiting events found dispatching network-vif-plugged-cac4dfaa-510a-4330-b9b1-aeb25f57abef {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:07:39 user nova-compute[71605]: WARNING nova.compute.manager [req-65160bf0-588a-407d-ac76-974155d4a759 req-67601db3-8e54-4f13-88aa-248352f756ab service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Received unexpected event network-vif-plugged-cac4dfaa-510a-4330-b9b1-aeb25f57abef for instance with vm_state deleted and task_state None. 
Apr 20 16:07:39 user nova-compute[71605]: DEBUG nova.compute.manager [req-65160bf0-588a-407d-ac76-974155d4a759 req-67601db3-8e54-4f13-88aa-248352f756ab service nova] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Received event network-vif-deleted-cac4dfaa-510a-4330-b9b1-aeb25f57abef {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:07:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:41 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:07:41 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:07:41 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:07:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:41 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:07:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:07:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:07:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:07:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:07:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk.rescue --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:07:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk.rescue --force-share --output=json" returned: 0 in 0.134s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:07:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk.rescue --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:07:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk.rescue --force-share --output=json" returned: 0 in 0.142s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:07:41 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk --force-share --output=json {{(pid=71605) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Acquiring lock "a145fb51-4ca5-4cc4-b8bd-cd3665bef473" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "a145fb51-4ca5-4cc4-b8bd-cd3665bef473" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Acquiring lock "a145fb51-4ca5-4cc4-b8bd-cd3665bef473-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "a145fb51-4ca5-4cc4-b8bd-cd3665bef473-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "a145fb51-4ca5-4cc4-b8bd-cd3665bef473-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:42 user nova-compute[71605]: INFO nova.compute.manager [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Terminating instance Apr 20 16:07:42 user nova-compute[71605]: DEBUG nova.compute.manager [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Start destroying the instance on the hypervisor. 
{{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a145fb51-4ca5-4cc4-b8bd-cd3665bef473/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG nova.compute.manager [req-1437160b-a3c4-47e3-8457-d035ccca5842 req-b6f1c217-85c0-4b6a-bcdf-0f107f752fc0 service nova] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Received event network-vif-unplugged-989ee5cd-ff10-4bcc-9b11-017b23299187 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-1437160b-a3c4-47e3-8457-d035ccca5842 req-b6f1c217-85c0-4b6a-bcdf-0f107f752fc0 service nova] Acquiring lock "a145fb51-4ca5-4cc4-b8bd-cd3665bef473-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-1437160b-a3c4-47e3-8457-d035ccca5842 req-b6f1c217-85c0-4b6a-bcdf-0f107f752fc0 service nova] Lock "a145fb51-4ca5-4cc4-b8bd-cd3665bef473-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-1437160b-a3c4-47e3-8457-d035ccca5842 req-b6f1c217-85c0-4b6a-bcdf-0f107f752fc0 service nova] Lock 
"a145fb51-4ca5-4cc4-b8bd-cd3665bef473-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG nova.compute.manager [req-1437160b-a3c4-47e3-8457-d035ccca5842 req-b6f1c217-85c0-4b6a-bcdf-0f107f752fc0 service nova] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] No waiting events found dispatching network-vif-unplugged-989ee5cd-ff10-4bcc-9b11-017b23299187 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG nova.compute.manager [req-1437160b-a3c4-47e3-8457-d035ccca5842 req-b6f1c217-85c0-4b6a-bcdf-0f107f752fc0 service nova] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Received event network-vif-unplugged-989ee5cd-ff10-4bcc-9b11-017b23299187 for instance with task_state deleting. {{(pid=71605) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a145fb51-4ca5-4cc4-b8bd-cd3665bef473/disk --force-share --output=json" returned: 0 in 0.154s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a145fb51-4ca5-4cc4-b8bd-cd3665bef473/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG nova.compute.manager [req-855c0ac5-d780-4a66-989e-b222977f642c req-22afe9b2-96fd-487c-bbe8-5c18206f39a6 service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Received event network-changed-e068d7e5-dc70-4b18-8dd6-5726f7a3bc84 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG nova.compute.manager [req-855c0ac5-d780-4a66-989e-b222977f642c req-22afe9b2-96fd-487c-bbe8-5c18206f39a6 service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Refreshing instance network info cache due to event network-changed-e068d7e5-dc70-4b18-8dd6-5726f7a3bc84. 
{{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-855c0ac5-d780-4a66-989e-b222977f642c req-22afe9b2-96fd-487c-bbe8-5c18206f39a6 service nova] Acquiring lock "refresh_cache-15d42ba7-cf47-4374-83b5-06d5242951b7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-855c0ac5-d780-4a66-989e-b222977f642c req-22afe9b2-96fd-487c-bbe8-5c18206f39a6 service nova] Acquired lock "refresh_cache-15d42ba7-cf47-4374-83b5-06d5242951b7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG nova.network.neutron [req-855c0ac5-d780-4a66-989e-b222977f642c req-22afe9b2-96fd-487c-bbe8-5c18206f39a6 service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Refreshing network info cache for port e068d7e5-dc70-4b18-8dd6-5726f7a3bc84 {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/a145fb51-4ca5-4cc4-b8bd-cd3665bef473/disk --force-share --output=json" returned: 0 in 0.176s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:07:42 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Instance destroyed successfully. 
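[Annotation, not part of the captured log] The repeated prlimit-wrapped "qemu-img info ... --force-share --output=json" commands in the surrounding entries are the per-instance disk probes nova-compute issues while refreshing resource usage. Below is a minimal sketch of reproducing one such probe and reading its JSON result; the helper name, the example path, and the printed fields are illustrative assumptions, not nova's actual code path (nova runs this through oslo_concurrency.processutils, as the log shows).

    import json
    import subprocess

    def qemu_img_info(path):
        # Same invocation as in the log: cap address space to 1 GiB and CPU time to 30 s
        # via oslo_concurrency.prlimit, then run qemu-img info with machine-readable output.
        cmd = [
            "/usr/bin/python3.10", "-m", "oslo_concurrency.prlimit",
            "--as=1073741824", "--cpu=30", "--",
            "env", "LC_ALL=C", "LANG=C",
            "qemu-img", "info", path, "--force-share", "--output=json",
        ]
        out = subprocess.check_output(cmd)
        return json.loads(out)

    # Example against one of the instance disks probed in this log span.
    info = qemu_img_info("/opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk")
    print(info.get("format"), info.get("virtual-size"), info.get("actual-size"))

--force-share lets the probe read a disk that is still attached to a running guest, and --output=json avoids having to parse the human-readable table; the prlimit wrapper bounds a misbehaving qemu-img so it cannot exhaust the compute host.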
Apr 20 16:07:42 user nova-compute[71605]: DEBUG nova.objects.instance [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lazy-loading 'resources' on Instance uuid a145fb51-4ca5-4cc4-b8bd-cd3665bef473 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:05:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1165125210',display_name='tempest-VolumesAdminNegativeTest-server-1165125210',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1165125210',id=14,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-20T16:06:01Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='a92cea9e1182477ca669c506b42eda60',ramdisk_id='',reservation_id='r-f72ecp4q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesAdminNegativeTest-978356230',owner_user_name='tempest-VolumesAdminNegativeTest-978356230-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:06:02Z,user_data=None,user_id='c92692a1d38b4531a4e7f42660a54c7b',uuid=a145fb51-4ca5-4cc4-b8bd-cd3665bef473,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "989ee5cd-ff10-4bcc-9b11-017b23299187", "address": "fa:16:3e:92:5b:74", "network": {"id": "40132b20-6bfd-4f5a-8f6f-75769961d157", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-683065417-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a92cea9e1182477ca669c506b42eda60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", 
"bound_drivers": {"0": "ovn"}}, "devname": "tap989ee5cd-ff", "ovs_interfaceid": "989ee5cd-ff10-4bcc-9b11-017b23299187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Converting VIF {"id": "989ee5cd-ff10-4bcc-9b11-017b23299187", "address": "fa:16:3e:92:5b:74", "network": {"id": "40132b20-6bfd-4f5a-8f6f-75769961d157", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-683065417-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a92cea9e1182477ca669c506b42eda60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap989ee5cd-ff", "ovs_interfaceid": "989ee5cd-ff10-4bcc-9b11-017b23299187", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:5b:74,bridge_name='br-int',has_traffic_filtering=True,id=989ee5cd-ff10-4bcc-9b11-017b23299187,network=Network(40132b20-6bfd-4f5a-8f6f-75769961d157),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap989ee5cd-ff') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG os_vif [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:5b:74,bridge_name='br-int',has_traffic_filtering=True,id=989ee5cd-ff10-4bcc-9b11-017b23299187,network=Network(40132b20-6bfd-4f5a-8f6f-75769961d157),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap989ee5cd-ff') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap989ee5cd-ff, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:07:42 user nova-compute[71605]: INFO os_vif [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:5b:74,bridge_name='br-int',has_traffic_filtering=True,id=989ee5cd-ff10-4bcc-9b11-017b23299187,network=Network(40132b20-6bfd-4f5a-8f6f-75769961d157),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap989ee5cd-ff') Apr 20 16:07:42 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Deleting instance files /opt/stack/data/nova/instances/a145fb51-4ca5-4cc4-b8bd-cd3665bef473_del Apr 20 16:07:42 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Deletion of /opt/stack/data/nova/instances/a145fb51-4ca5-4cc4-b8bd-cd3665bef473_del complete Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json" returned: 0 in 0.151s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:07:42 user nova-compute[71605]: INFO nova.compute.manager [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Took 0.79 seconds to destroy the instance on the hypervisor. Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:07:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:07:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:07:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:07:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:07:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3ac0a246-e2fe-4164-9bc1-c96bb94e396f/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:07:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD 
"/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3ac0a246-e2fe-4164-9bc1-c96bb94e396f/disk --force-share --output=json" returned: 0 in 0.167s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:07:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3ac0a246-e2fe-4164-9bc1-c96bb94e396f/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:07:43 user nova-compute[71605]: DEBUG nova.network.neutron [req-855c0ac5-d780-4a66-989e-b222977f642c req-22afe9b2-96fd-487c-bbe8-5c18206f39a6 service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Updated VIF entry in instance network info cache for port e068d7e5-dc70-4b18-8dd6-5726f7a3bc84. {{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:07:43 user nova-compute[71605]: DEBUG nova.network.neutron [req-855c0ac5-d780-4a66-989e-b222977f642c req-22afe9b2-96fd-487c-bbe8-5c18206f39a6 service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Updating instance_info_cache with network_info: [{"id": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "address": "fa:16:3e:15:a2:f4", "network": {"id": "9de26342-0f6c-4d7d-96a5-d4ad35573211", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1378273293-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.9", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbd2a72dddad4f2892243a33df4fa2d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape068d7e5-dc", "ovs_interfaceid": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:07:43 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-855c0ac5-d780-4a66-989e-b222977f642c req-22afe9b2-96fd-487c-bbe8-5c18206f39a6 service nova] Releasing lock "refresh_cache-15d42ba7-cf47-4374-83b5-06d5242951b7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:07:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3ac0a246-e2fe-4164-9bc1-c96bb94e396f/disk --force-share --output=json" returned: 0 in 0.152s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:07:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None 
req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:07:43 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:07:43 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Took 0.83 seconds to deallocate network for instance. Apr 20 16:07:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:07:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:07:43 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:43 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:07:44 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:07:44 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 20 16:07:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=8252MB free_disk=26.28937530517578GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": 
"0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG nova.compute.manager [req-c7d76e99-dcbe-46e0-a5da-6fb4f864ef7e req-b0f27748-4f41-4d72-a145-e6b3093f39ac service nova] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Received event network-vif-plugged-989ee5cd-ff10-4bcc-9b11-017b23299187 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-c7d76e99-dcbe-46e0-a5da-6fb4f864ef7e req-b0f27748-4f41-4d72-a145-e6b3093f39ac service nova] Acquiring lock "a145fb51-4ca5-4cc4-b8bd-cd3665bef473-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-c7d76e99-dcbe-46e0-a5da-6fb4f864ef7e req-b0f27748-4f41-4d72-a145-e6b3093f39ac service nova] Lock "a145fb51-4ca5-4cc4-b8bd-cd3665bef473-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-c7d76e99-dcbe-46e0-a5da-6fb4f864ef7e req-b0f27748-4f41-4d72-a145-e6b3093f39ac service nova] Lock "a145fb51-4ca5-4cc4-b8bd-cd3665bef473-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG nova.compute.manager [req-c7d76e99-dcbe-46e0-a5da-6fb4f864ef7e req-b0f27748-4f41-4d72-a145-e6b3093f39ac service nova] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] No waiting events found dispatching 
network-vif-plugged-989ee5cd-ff10-4bcc-9b11-017b23299187 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:07:44 user nova-compute[71605]: WARNING nova.compute.manager [req-c7d76e99-dcbe-46e0-a5da-6fb4f864ef7e req-b0f27748-4f41-4d72-a145-e6b3093f39ac service nova] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Received unexpected event network-vif-plugged-989ee5cd-ff10-4bcc-9b11-017b23299187 for instance with vm_state deleted and task_state None. Apr 20 16:07:44 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.654s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.103s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:44 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Deleted allocations for instance a145fb51-4ca5-4cc4-b8bd-cd3665bef473 Apr 20 16:07:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-453e3070-d61a-46f9-8e30-0d71853f9e2d tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "a145fb51-4ca5-4cc4-b8bd-cd3665bef473" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.495s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance d4ea4d29-b178-4da2-b971-76f97031b244 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance e8f62d46-e2dc-4870-adf1-f62d88bb653b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance fe0bde76-a4f8-4865-91af-2bd3790587a7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance dc918ed4-8bc6-4a4f-a189-d6cdd5817854 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 15d42ba7-cf47-4374-83b5-06d5242951b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 3ac0a246-e2fe-4164-9bc1-c96bb94e396f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 6 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=1280MB phys_disk=40GB used_disk=6GB total_vcpus=12 used_vcpus=6 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG nova.compute.manager [req-a0a32b64-2304-42f1-a466-e415c81d4204 req-26b682dd-3603-40c6-b642-2079221e63da service nova] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Received event network-vif-deleted-989ee5cd-ff10-4bcc-9b11-017b23299187 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:07:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.336s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:07:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:07:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) 
_heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:07:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:07:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-d4ea4d29-b178-4da2-b971-76f97031b244" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:07:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-d4ea4d29-b178-4da2-b971-76f97031b244" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:07:45 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:07:45 user nova-compute[71605]: DEBUG nova.objects.instance [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lazy-loading 'info_cache' on Instance uuid d4ea4d29-b178-4da2-b971-76f97031b244 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:07:46 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Updating instance_info_cache with network_info: [{"id": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "address": "fa:16:3e:44:d8:d0", "network": {"id": "c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b36b1a4-9a", "ovs_interfaceid": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:07:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock "refresh_cache-d4ea4d29-b178-4da2-b971-76f97031b244" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:07:46 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:07:46 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None 
req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:07:46 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:07:46 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:07:46 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:07:46 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:07:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:49 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:50 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:07:50 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] VM Stopped (Lifecycle Event) Apr 20 16:07:50 user nova-compute[71605]: DEBUG nova.compute.manager [None req-7622b7fd-58ae-467e-a222-c02d6e29997a None None] [instance: c2b84ca2-f67b-4219-b7e6-18d2029e998a] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:07:52 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:07:52 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] VM Stopped (Lifecycle Event) Apr 20 16:07:52 user nova-compute[71605]: DEBUG nova.compute.manager [None req-a479c3f5-0afb-4944-acc5-031de88fbcf4 None None] [instance: a760987f-1a65-4e42-8cef-73db9ef2db48] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:07:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Acquiring lock "4f841186-7958-4642-9050-9b048b61ebbb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Lock "4f841186-7958-4642-9050-9b048b61ebbb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:53 user nova-compute[71605]: DEBUG nova.compute.manager [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Starting instance... {{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:07:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:53 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:07:53 user nova-compute[71605]: INFO nova.compute.claims [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Claim successful on node user Apr 20 16:07:54 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.322s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Start building networks asynchronously for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Allocating IP information in the background. {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG nova.network.neutron [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:07:54 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 20 16:07:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Start spawning the instance on the hypervisor. {{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:07:54 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Creating image(s) Apr 20 16:07:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Acquiring lock "/opt/stack/data/nova/instances/4f841186-7958-4642-9050-9b048b61ebbb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Lock "/opt/stack/data/nova/instances/4f841186-7958-4642-9050-9b048b61ebbb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Lock "/opt/stack/data/nova/instances/4f841186-7958-4642-9050-9b048b61ebbb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 
16:07:54 user nova-compute[71605]: DEBUG nova.policy [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '403bee038ece4e9aa023aad83ee8f188', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ec92faae7a5d40f98409e9634a9dbf9b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.135s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.135s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Running 
cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/4f841186-7958-4642-9050-9b048b61ebbb/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/4f841186-7958-4642-9050-9b048b61ebbb/disk 1073741824" returned: 0 in 0.048s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.190s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.141s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Checking if we can resize image /opt/stack/data/nova/instances/4f841186-7958-4642-9050-9b048b61ebbb/disk. 
size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:07:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4f841186-7958-4642-9050-9b048b61ebbb/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:07:55 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4f841186-7958-4642-9050-9b048b61ebbb/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:07:55 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Cannot resize image /opt/stack/data/nova/instances/4f841186-7958-4642-9050-9b048b61ebbb/disk to a smaller size. {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:07:55 user nova-compute[71605]: DEBUG nova.objects.instance [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Lazy-loading 'migration_context' on Instance uuid 4f841186-7958-4642-9050-9b048b61ebbb {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:07:55 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:07:55 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Ensure instance console log exists: /opt/stack/data/nova/instances/4f841186-7958-4642-9050-9b048b61ebbb/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:07:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71605) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:55 user nova-compute[71605]: DEBUG nova.network.neutron [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Successfully created port: fe7ac99b-2b51-4ae2-9903-9ae286328c8b {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:07:55 user nova-compute[71605]: DEBUG nova.network.neutron [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Successfully updated port: fe7ac99b-2b51-4ae2-9903-9ae286328c8b {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:07:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Acquiring lock "refresh_cache-4f841186-7958-4642-9050-9b048b61ebbb" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:07:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Acquired lock "refresh_cache-4f841186-7958-4642-9050-9b048b61ebbb" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:07:55 user nova-compute[71605]: DEBUG nova.network.neutron [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:07:55 user nova-compute[71605]: DEBUG nova.compute.manager [req-22de1b86-9c7f-4b6c-a245-57d32e3d20bd req-c63d848e-7428-407a-8b03-59d1deb3d648 service nova] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Received event network-changed-fe7ac99b-2b51-4ae2-9903-9ae286328c8b {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:07:55 user nova-compute[71605]: DEBUG nova.compute.manager [req-22de1b86-9c7f-4b6c-a245-57d32e3d20bd req-c63d848e-7428-407a-8b03-59d1deb3d648 service nova] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Refreshing instance network info cache due to event network-changed-fe7ac99b-2b51-4ae2-9903-9ae286328c8b. 
{{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:07:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-22de1b86-9c7f-4b6c-a245-57d32e3d20bd req-c63d848e-7428-407a-8b03-59d1deb3d648 service nova] Acquiring lock "refresh_cache-4f841186-7958-4642-9050-9b048b61ebbb" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:07:55 user nova-compute[71605]: DEBUG nova.network.neutron [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Instance cache missing network info. {{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.network.neutron [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Updating instance_info_cache with network_info: [{"id": "fe7ac99b-2b51-4ae2-9903-9ae286328c8b", "address": "fa:16:3e:6b:e1:59", "network": {"id": "6deca2c3-6467-44e2-aa30-6cf5abf230f5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1683519822-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "ec92faae7a5d40f98409e9634a9dbf9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe7ac99b-2b", "ovs_interfaceid": "fe7ac99b-2b51-4ae2-9903-9ae286328c8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Releasing lock "refresh_cache-4f841186-7958-4642-9050-9b048b61ebbb" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.compute.manager [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Instance network_info: |[{"id": "fe7ac99b-2b51-4ae2-9903-9ae286328c8b", "address": "fa:16:3e:6b:e1:59", "network": {"id": "6deca2c3-6467-44e2-aa30-6cf5abf230f5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1683519822-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "ec92faae7a5d40f98409e9634a9dbf9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, 
"devname": "tapfe7ac99b-2b", "ovs_interfaceid": "fe7ac99b-2b51-4ae2-9903-9ae286328c8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-22de1b86-9c7f-4b6c-a245-57d32e3d20bd req-c63d848e-7428-407a-8b03-59d1deb3d648 service nova] Acquired lock "refresh_cache-4f841186-7958-4642-9050-9b048b61ebbb" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.network.neutron [req-22de1b86-9c7f-4b6c-a245-57d32e3d20bd req-c63d848e-7428-407a-8b03-59d1deb3d648 service nova] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Refreshing network info cache for port fe7ac99b-2b51-4ae2-9903-9ae286328c8b {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Start _get_guest_xml network_info=[{"id": "fe7ac99b-2b51-4ae2-9903-9ae286328c8b", "address": "fa:16:3e:6b:e1:59", "network": {"id": "6deca2c3-6467-44e2-aa30-6cf5abf230f5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1683519822-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "ec92faae7a5d40f98409e9634a9dbf9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe7ac99b-2b", "ovs_interfaceid": "fe7ac99b-2b51-4ae2-9903-9ae286328c8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': '4ac69ea5-e5d7-40c8-864e-0a164d78a727'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:07:56 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None 
req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:07:56 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, 
threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:07:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-884360922',display_name='tempest-VolumesActionsTest-instance-884360922',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-884360922',id=17,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec92faae7a5d40f98409e9634a9dbf9b',ramdisk_id='',reservation_id='r-yae0i2gd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1745644681',owner_user_name='tempest-VolumesActionsTest-1745644681-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:07:54Z,user_data=None,user_id='403bee038ece4e9aa023aad83ee8f188',uuid=4f841186-7958-4642-9050-9b048b61ebbb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe7ac99b-2b51-4ae2-9903-9ae286328c8b", "address": "fa:16:3e:6b:e1:59", "network": {"id": "6deca2c3-6467-44e2-aa30-6cf5abf230f5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1683519822-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "ec92faae7a5d40f98409e9634a9dbf9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe7ac99b-2b", "ovs_interfaceid": "fe7ac99b-2b51-4ae2-9903-9ae286328c8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Converting VIF {"id": "fe7ac99b-2b51-4ae2-9903-9ae286328c8b", "address": "fa:16:3e:6b:e1:59", "network": {"id": "6deca2c3-6467-44e2-aa30-6cf5abf230f5", "bridge": "br-int", "label": 
"tempest-VolumesActionsTest-1683519822-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "ec92faae7a5d40f98409e9634a9dbf9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe7ac99b-2b", "ovs_interfaceid": "fe7ac99b-2b51-4ae2-9903-9ae286328c8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:e1:59,bridge_name='br-int',has_traffic_filtering=True,id=fe7ac99b-2b51-4ae2-9903-9ae286328c8b,network=Network(6deca2c3-6467-44e2-aa30-6cf5abf230f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe7ac99b-2b') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.objects.instance [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Lazy-loading 'pci_devices' on Instance uuid 4f841186-7958-4642-9050-9b048b61ebbb {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] End _get_guest_xml xml= Apr 20 16:07:56 user nova-compute[71605]: 4f841186-7958-4642-9050-9b048b61ebbb Apr 20 16:07:56 user nova-compute[71605]: instance-00000011 Apr 20 16:07:56 user nova-compute[71605]: 131072 Apr 20 16:07:56 user nova-compute[71605]: 1 Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: tempest-VolumesActionsTest-instance-884360922 Apr 20 16:07:56 user nova-compute[71605]: 2023-04-20 16:07:56 Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: 128 Apr 20 16:07:56 user nova-compute[71605]: 1 Apr 20 16:07:56 user nova-compute[71605]: 0 Apr 20 16:07:56 user nova-compute[71605]: 0 Apr 20 16:07:56 user nova-compute[71605]: 1 Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: tempest-VolumesActionsTest-1745644681-project-member Apr 20 16:07:56 user nova-compute[71605]: tempest-VolumesActionsTest-1745644681 Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user 
nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: OpenStack Foundation Apr 20 16:07:56 user nova-compute[71605]: OpenStack Nova Apr 20 16:07:56 user nova-compute[71605]: 0.0.0 Apr 20 16:07:56 user nova-compute[71605]: 4f841186-7958-4642-9050-9b048b61ebbb Apr 20 16:07:56 user nova-compute[71605]: 4f841186-7958-4642-9050-9b048b61ebbb Apr 20 16:07:56 user nova-compute[71605]: Virtual Machine Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: hvm Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Nehalem Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: /dev/urandom Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: Apr 20 16:07:56 user nova-compute[71605]: {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:07:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-884360922',display_name='tempest-VolumesActionsTest-instance-884360922',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-884360922',id=17,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ec92faae7a5d40f98409e9634a9dbf9b',ramdisk_id='',reservation_id='r-yae0i2gd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-1745644681',owner_user_name='tempest-VolumesActionsTest-1745644681-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:07:54Z,user_data=None,user_id='403bee038ece4e9aa023aad83ee8f188',uuid=4f841186-7958-4642-9050-9b048b61ebbb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe7ac99b-2b51-4ae2-9903-9ae286328c8b", "address": "fa:16:3e:6b:e1:59", "network": {"id": "6deca2c3-6467-44e2-aa30-6cf5abf230f5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1683519822-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "ec92faae7a5d40f98409e9634a9dbf9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe7ac99b-2b", "ovs_interfaceid": "fe7ac99b-2b51-4ae2-9903-9ae286328c8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Converting VIF {"id": "fe7ac99b-2b51-4ae2-9903-9ae286328c8b", "address": "fa:16:3e:6b:e1:59", "network": {"id": "6deca2c3-6467-44e2-aa30-6cf5abf230f5", "bridge": "br-int", "label": 
"tempest-VolumesActionsTest-1683519822-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "ec92faae7a5d40f98409e9634a9dbf9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe7ac99b-2b", "ovs_interfaceid": "fe7ac99b-2b51-4ae2-9903-9ae286328c8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:e1:59,bridge_name='br-int',has_traffic_filtering=True,id=fe7ac99b-2b51-4ae2-9903-9ae286328c8b,network=Network(6deca2c3-6467-44e2-aa30-6cf5abf230f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe7ac99b-2b') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG os_vif [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:e1:59,bridge_name='br-int',has_traffic_filtering=True,id=fe7ac99b-2b51-4ae2-9903-9ae286328c8b,network=Network(6deca2c3-6467-44e2-aa30-6cf5abf230f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe7ac99b-2b') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe7ac99b-2b, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfe7ac99b-2b, col_values=(('external_ids', {'iface-id': 
'fe7ac99b-2b51-4ae2-9903-9ae286328c8b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6b:e1:59', 'vm-uuid': '4f841186-7958-4642-9050-9b048b61ebbb'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:56 user nova-compute[71605]: INFO os_vif [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:e1:59,bridge_name='br-int',has_traffic_filtering=True,id=fe7ac99b-2b51-4ae2-9903-9ae286328c8b,network=Network(6deca2c3-6467-44e2-aa30-6cf5abf230f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe7ac99b-2b') Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] No BDM found with device name vda, not building metadata. {{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] No VIF found with MAC fa:16:3e:6b:e1:59, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.network.neutron [req-22de1b86-9c7f-4b6c-a245-57d32e3d20bd req-c63d848e-7428-407a-8b03-59d1deb3d648 service nova] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Updated VIF entry in instance network info cache for port fe7ac99b-2b51-4ae2-9903-9ae286328c8b. 
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG nova.network.neutron [req-22de1b86-9c7f-4b6c-a245-57d32e3d20bd req-c63d848e-7428-407a-8b03-59d1deb3d648 service nova] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Updating instance_info_cache with network_info: [{"id": "fe7ac99b-2b51-4ae2-9903-9ae286328c8b", "address": "fa:16:3e:6b:e1:59", "network": {"id": "6deca2c3-6467-44e2-aa30-6cf5abf230f5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1683519822-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "ec92faae7a5d40f98409e9634a9dbf9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe7ac99b-2b", "ovs_interfaceid": "fe7ac99b-2b51-4ae2-9903-9ae286328c8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:07:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-22de1b86-9c7f-4b6c-a245-57d32e3d20bd req-c63d848e-7428-407a-8b03-59d1deb3d648 service nova] Releasing lock "refresh_cache-4f841186-7958-4642-9050-9b048b61ebbb" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:07:57 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:07:57 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] VM Stopped (Lifecycle Event) Apr 20 16:07:57 user nova-compute[71605]: DEBUG nova.compute.manager [None req-0db4ae36-7506-4b8c-af35-a891f2417fde None None] [instance: a145fb51-4ca5-4cc4-b8bd-cd3665bef473] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:07:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:58 user nova-compute[71605]: DEBUG nova.compute.manager [req-90534178-4d34-498f-8f26-4545095b6ff7 req-50a7f591-30b6-49eb-a6d1-6aa55ced25be service nova] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Received event network-vif-plugged-fe7ac99b-2b51-4ae2-9903-9ae286328c8b {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:07:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils 
[req-90534178-4d34-498f-8f26-4545095b6ff7 req-50a7f591-30b6-49eb-a6d1-6aa55ced25be service nova] Acquiring lock "4f841186-7958-4642-9050-9b048b61ebbb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:07:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-90534178-4d34-498f-8f26-4545095b6ff7 req-50a7f591-30b6-49eb-a6d1-6aa55ced25be service nova] Lock "4f841186-7958-4642-9050-9b048b61ebbb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:07:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-90534178-4d34-498f-8f26-4545095b6ff7 req-50a7f591-30b6-49eb-a6d1-6aa55ced25be service nova] Lock "4f841186-7958-4642-9050-9b048b61ebbb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:07:58 user nova-compute[71605]: DEBUG nova.compute.manager [req-90534178-4d34-498f-8f26-4545095b6ff7 req-50a7f591-30b6-49eb-a6d1-6aa55ced25be service nova] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] No waiting events found dispatching network-vif-plugged-fe7ac99b-2b51-4ae2-9903-9ae286328c8b {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:07:58 user nova-compute[71605]: WARNING nova.compute.manager [req-90534178-4d34-498f-8f26-4545095b6ff7 req-50a7f591-30b6-49eb-a6d1-6aa55ced25be service nova] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Received unexpected event network-vif-plugged-fe7ac99b-2b51-4ae2-9903-9ae286328c8b for instance with vm_state building and task_state spawning. Apr 20 16:07:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:07:59 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:07:59 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] VM Resumed (Lifecycle Event) Apr 20 16:07:59 user nova-compute[71605]: DEBUG nova.compute.manager [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:07:59 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:07:59 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Instance spawned successfully. 
Apr 20 16:07:59 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:07:59 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:07:59 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:07:59 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:07:59 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Found default for hw_disk_bus of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:07:59 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:07:59 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:07:59 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:07:59 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Found default for hw_vif_model of virtio 
{{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:07:59 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:07:59 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:07:59 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] VM Started (Lifecycle Event) Apr 20 16:07:59 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:07:59 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:07:59 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:08:00 user nova-compute[71605]: INFO nova.compute.manager [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Took 5.53 seconds to spawn the instance on the hypervisor. Apr 20 16:08:00 user nova-compute[71605]: DEBUG nova.compute.manager [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:08:00 user nova-compute[71605]: INFO nova.compute.manager [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Took 6.16 seconds to build instance. 
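The "Synchronizing instance power state ... DB power_state: 0, VM power_state: 1" entries followed by "pending task (spawning). Skip." reflect a simple guard: while a task is in flight, the lifecycle handler does not reconcile power state. A hedged sketch of that check; the helper name is illustrative and the numeric constants are assumed to correspond to Nova's NOSTATE/RUNNING power states as logged:

    # 0 and 1 are the DB/VM power states seen in the log (assumed to map to
    # nova.compute.power_state NOSTATE and RUNNING).
    NOSTATE, RUNNING = 0, 1

    def should_sync_power_state(task_state, db_power_state, vm_power_state):
        if task_state is not None:        # e.g. 'spawning' while vm_state is 'building'
            return False                  # the in-flight task owns the state; skip
        return db_power_state != vm_power_state

    # Matches the log: task_state='spawning' -> skip even though 0 != 1.
    assert should_sync_power_state("spawning", NOSTATE, RUNNING) is False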
Apr 20 16:08:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-80667c4e-636a-47b0-9f62-145495d3867d tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Lock "4f841186-7958-4642-9050-9b048b61ebbb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.256s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:00 user nova-compute[71605]: DEBUG nova.compute.manager [req-629a15ab-525f-4620-af7c-486edbc11782 req-31a62902-b507-462e-9be5-c99fba986b3d service nova] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Received event network-vif-plugged-fe7ac99b-2b51-4ae2-9903-9ae286328c8b {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:08:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-629a15ab-525f-4620-af7c-486edbc11782 req-31a62902-b507-462e-9be5-c99fba986b3d service nova] Acquiring lock "4f841186-7958-4642-9050-9b048b61ebbb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-629a15ab-525f-4620-af7c-486edbc11782 req-31a62902-b507-462e-9be5-c99fba986b3d service nova] Lock "4f841186-7958-4642-9050-9b048b61ebbb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-629a15ab-525f-4620-af7c-486edbc11782 req-31a62902-b507-462e-9be5-c99fba986b3d service nova] Lock "4f841186-7958-4642-9050-9b048b61ebbb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:00 user nova-compute[71605]: DEBUG nova.compute.manager [req-629a15ab-525f-4620-af7c-486edbc11782 req-31a62902-b507-462e-9be5-c99fba986b3d service nova] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] No waiting events found dispatching network-vif-plugged-fe7ac99b-2b51-4ae2-9903-9ae286328c8b {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:08:00 user nova-compute[71605]: WARNING nova.compute.manager [req-629a15ab-525f-4620-af7c-486edbc11782 req-31a62902-b507-462e-9be5-c99fba986b3d service nova] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Received unexpected event network-vif-plugged-fe7ac99b-2b51-4ae2-9903-9ae286328c8b for instance with vm_state active and task_state None. 
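The lockutils lines above also record wait and hold times ("waited 0.000s", "held 6.256s"), which makes them easy to mine when a compute node feels slow. A small parser sketch for the 'Lock "..." "released" ... held N.NNNs' form used throughout this log; the function name and threshold are illustrative:

    import re

    # Pulls lock name, holder and hold time out of lines like:
    #   Lock "<uuid>" "released" by "<qualname>" :: held 6.256s
    HELD = re.compile(r'Lock "(?P<name>[^"]+)" "released" by "(?P<holder>[^"]+)" '
                      r':: held (?P<secs>[\d.]+)s')

    def long_held_locks(log_lines, threshold=1.0):
        for line in log_lines:
            m = HELD.search(line)
            if m and float(m.group("secs")) >= threshold:
                yield m.group("name"), m.group("holder"), float(m.group("secs"))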
Apr 20 16:08:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:04 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:08:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:08:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:08:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:08:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:14 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:19 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:24 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:24 user nova-compute[71605]: DEBUG nova.compute.manager [req-9a17c8ef-99d5-46c8-936c-679340b0c68e req-9f891d1b-fc61-4b6c-adf7-2b01a8e41b9d service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Received event network-changed-64234034-3bc7-49ec-adb2-d425da7301e7 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:08:24 user nova-compute[71605]: DEBUG nova.compute.manager [req-9a17c8ef-99d5-46c8-936c-679340b0c68e req-9f891d1b-fc61-4b6c-adf7-2b01a8e41b9d service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Refreshing instance network info cache due to event network-changed-64234034-3bc7-49ec-adb2-d425da7301e7. 
{{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:08:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-9a17c8ef-99d5-46c8-936c-679340b0c68e req-9f891d1b-fc61-4b6c-adf7-2b01a8e41b9d service nova] Acquiring lock "refresh_cache-3ac0a246-e2fe-4164-9bc1-c96bb94e396f" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:08:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-9a17c8ef-99d5-46c8-936c-679340b0c68e req-9f891d1b-fc61-4b6c-adf7-2b01a8e41b9d service nova] Acquired lock "refresh_cache-3ac0a246-e2fe-4164-9bc1-c96bb94e396f" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:08:24 user nova-compute[71605]: DEBUG nova.network.neutron [req-9a17c8ef-99d5-46c8-936c-679340b0c68e req-9f891d1b-fc61-4b6c-adf7-2b01a8e41b9d service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Refreshing network info cache for port 64234034-3bc7-49ec-adb2-d425da7301e7 {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:08:24 user nova-compute[71605]: DEBUG nova.network.neutron [req-9a17c8ef-99d5-46c8-936c-679340b0c68e req-9f891d1b-fc61-4b6c-adf7-2b01a8e41b9d service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Updated VIF entry in instance network info cache for port 64234034-3bc7-49ec-adb2-d425da7301e7. {{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:08:24 user nova-compute[71605]: DEBUG nova.network.neutron [req-9a17c8ef-99d5-46c8-936c-679340b0c68e req-9f891d1b-fc61-4b6c-adf7-2b01a8e41b9d service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Updating instance_info_cache with network_info: [{"id": "64234034-3bc7-49ec-adb2-d425da7301e7", "address": "fa:16:3e:4b:81:b9", "network": {"id": "27275346-fa92-4114-a62b-d59f0212eb8f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-871140467-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.86", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "77f831070f5847bda788f6f0fcfedb03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64234034-3b", "ovs_interfaceid": "64234034-3bc7-49ec-adb2-d425da7301e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:08:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-9a17c8ef-99d5-46c8-936c-679340b0c68e req-9f891d1b-fc61-4b6c-adf7-2b01a8e41b9d service nova] Releasing lock "refresh_cache-3ac0a246-e2fe-4164-9bc1-c96bb94e396f" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Acquiring lock 
"3ac0a246-e2fe-4164-9bc1-c96bb94e396f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "3ac0a246-e2fe-4164-9bc1-c96bb94e396f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Acquiring lock "3ac0a246-e2fe-4164-9bc1-c96bb94e396f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "3ac0a246-e2fe-4164-9bc1-c96bb94e396f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "3ac0a246-e2fe-4164-9bc1-c96bb94e396f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:26 user nova-compute[71605]: INFO nova.compute.manager [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Terminating instance Apr 20 16:08:26 user nova-compute[71605]: DEBUG nova.compute.manager [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Start destroying the instance on the hypervisor. 
{{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG nova.compute.manager [req-fdb10f1c-aadc-43b7-a938-91c647d2f6da req-66700460-a8bd-4df0-97ba-036eb63f06e8 service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Received event network-vif-unplugged-64234034-3bc7-49ec-adb2-d425da7301e7 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-fdb10f1c-aadc-43b7-a938-91c647d2f6da req-66700460-a8bd-4df0-97ba-036eb63f06e8 service nova] Acquiring lock "3ac0a246-e2fe-4164-9bc1-c96bb94e396f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-fdb10f1c-aadc-43b7-a938-91c647d2f6da req-66700460-a8bd-4df0-97ba-036eb63f06e8 service nova] Lock "3ac0a246-e2fe-4164-9bc1-c96bb94e396f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-fdb10f1c-aadc-43b7-a938-91c647d2f6da req-66700460-a8bd-4df0-97ba-036eb63f06e8 service nova] Lock "3ac0a246-e2fe-4164-9bc1-c96bb94e396f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG nova.compute.manager [req-fdb10f1c-aadc-43b7-a938-91c647d2f6da req-66700460-a8bd-4df0-97ba-036eb63f06e8 service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] No waiting events found dispatching network-vif-unplugged-64234034-3bc7-49ec-adb2-d425da7301e7 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG nova.compute.manager [req-fdb10f1c-aadc-43b7-a938-91c647d2f6da req-66700460-a8bd-4df0-97ba-036eb63f06e8 service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Received event network-vif-unplugged-64234034-3bc7-49ec-adb2-d425da7301e7 for instance with task_state deleting. 
{{(pid=71605) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:26 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Instance destroyed successfully. Apr 20 16:08:26 user nova-compute[71605]: DEBUG nova.objects.instance [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lazy-loading 'resources' on Instance uuid 3ac0a246-e2fe-4164-9bc1-c96bb94e396f {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:06:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-673629515',display_name='tempest-AttachVolumeTestJSON-server-673629515',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-673629515',id=16,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGZM/GSdMLabrovqLAcYqsj7WFJ8JEyf+MdNfn+7QjGV1E8w98tErRtmHPGjmfT7XNg40a0X/HuPTbbuPZsBHAMaW5V6k6XIxdNK2JrY++eeL0UNW7ZwAqAXZ0rf7wYalg==',key_name='tempest-keypair-398342575',keypairs=,launch_index=0,launched_at=2023-04-20T16:06:40Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='77f831070f5847bda788f6f0fcfedb03',ramdisk_id='',reservation_id='r-fhivp1qm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeTestJSON-1838780462',owner_user_name='tempest-AttachVolumeTestJSON-1838780462-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:06:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='1c8f57b12bc749888ea89bdbee258811',uuid=3ac0a246-e2fe-4164-9bc1-c96bb94e396f,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "64234034-3bc7-49ec-adb2-d425da7301e7", "address": "fa:16:3e:4b:81:b9", "network": {"id": "27275346-fa92-4114-a62b-d59f0212eb8f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-871140467-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.86", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "77f831070f5847bda788f6f0fcfedb03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64234034-3b", "ovs_interfaceid": "64234034-3bc7-49ec-adb2-d425da7301e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Converting VIF {"id": "64234034-3bc7-49ec-adb2-d425da7301e7", "address": "fa:16:3e:4b:81:b9", "network": {"id": "27275346-fa92-4114-a62b-d59f0212eb8f", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-871140467-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.86", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "77f831070f5847bda788f6f0fcfedb03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap64234034-3b", "ovs_interfaceid": "64234034-3bc7-49ec-adb2-d425da7301e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4b:81:b9,bridge_name='br-int',has_traffic_filtering=True,id=64234034-3bc7-49ec-adb2-d425da7301e7,network=Network(27275346-fa92-4114-a62b-d59f0212eb8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64234034-3b') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG os_vif [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4b:81:b9,bridge_name='br-int',has_traffic_filtering=True,id=64234034-3bc7-49ec-adb2-d425da7301e7,network=Network(27275346-fa92-4114-a62b-d59f0212eb8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64234034-3b') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap64234034-3b, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:08:26 user nova-compute[71605]: INFO os_vif [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4b:81:b9,bridge_name='br-int',has_traffic_filtering=True,id=64234034-3bc7-49ec-adb2-d425da7301e7,network=Network(27275346-fa92-4114-a62b-d59f0212eb8f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap64234034-3b') Apr 20 16:08:26 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 
tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Deleting instance files /opt/stack/data/nova/instances/3ac0a246-e2fe-4164-9bc1-c96bb94e396f_del Apr 20 16:08:26 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Deletion of /opt/stack/data/nova/instances/3ac0a246-e2fe-4164-9bc1-c96bb94e396f_del complete Apr 20 16:08:26 user nova-compute[71605]: INFO nova.compute.manager [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 20 16:08:26 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:08:26 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:08:27 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:08:27 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Took 0.72 seconds to deallocate network for instance. 
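The "Unplugging vif VIFOpenVSwitch(...)" / "Successfully unplugged vif" pair above is the module-level os-vif API at work during teardown. A sketch of that call with values copied from the log; treat the object constructors and field names as an approximation of the os_vif API rather than a verified recipe:

    import os_vif
    from os_vif.objects import instance_info, vif as vif_obj

    os_vif.initialize()    # loads the linux_bridge/noop/ovs plugins

    info = instance_info.InstanceInfo(
        uuid="3ac0a246-e2fe-4164-9bc1-c96bb94e396f",
        name="tempest-AttachVolumeTestJSON-server-673629515")
    vif = vif_obj.VIFOpenVSwitch(
        id="64234034-3bc7-49ec-adb2-d425da7301e7",
        address="fa:16:3e:4b:81:b9",
        bridge_name="br-int",
        vif_name="tap64234034-3b")

    os_vif.unplug(vif, info)   # removes tap64234034-3b from br-int, as logged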
Apr 20 16:08:27 user nova-compute[71605]: DEBUG nova.compute.manager [req-3951f69f-929a-4dc6-b663-f897e84600e4 req-fbf08cd3-2388-46cc-9498-d7d748b21d1d service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Received event network-vif-deleted-64234034-3bc7-49ec-adb2-d425da7301e7 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:08:27 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:27 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:27 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:08:27 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:08:27 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.257s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:27 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Deleted allocations for instance 3ac0a246-e2fe-4164-9bc1-c96bb94e396f Apr 20 16:08:27 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-dea51ff0-cfac-4c00-bbe2-0c44ceab5650 tempest-AttachVolumeTestJSON-1838780462 tempest-AttachVolumeTestJSON-1838780462-project-member] Lock "3ac0a246-e2fe-4164-9bc1-c96bb94e396f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.847s {{(pid=71605) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:28 user nova-compute[71605]: DEBUG nova.compute.manager [req-a7ad98e0-4490-443d-94e2-38cc4b604954 req-a90e293d-30c8-422c-82f1-a04040ed292a service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Received event network-vif-plugged-64234034-3bc7-49ec-adb2-d425da7301e7 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:08:28 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a7ad98e0-4490-443d-94e2-38cc4b604954 req-a90e293d-30c8-422c-82f1-a04040ed292a service nova] Acquiring lock "3ac0a246-e2fe-4164-9bc1-c96bb94e396f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:28 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a7ad98e0-4490-443d-94e2-38cc4b604954 req-a90e293d-30c8-422c-82f1-a04040ed292a service nova] Lock "3ac0a246-e2fe-4164-9bc1-c96bb94e396f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:28 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a7ad98e0-4490-443d-94e2-38cc4b604954 req-a90e293d-30c8-422c-82f1-a04040ed292a service nova] Lock "3ac0a246-e2fe-4164-9bc1-c96bb94e396f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:28 user nova-compute[71605]: DEBUG nova.compute.manager [req-a7ad98e0-4490-443d-94e2-38cc4b604954 req-a90e293d-30c8-422c-82f1-a04040ed292a service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] No waiting events found dispatching network-vif-plugged-64234034-3bc7-49ec-adb2-d425da7301e7 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:08:28 user nova-compute[71605]: WARNING nova.compute.manager [req-a7ad98e0-4490-443d-94e2-38cc4b604954 req-a90e293d-30c8-422c-82f1-a04040ed292a service nova] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Received unexpected event network-vif-plugged-64234034-3bc7-49ec-adb2-d425da7301e7 for instance with vm_state deleted and task_state None. 
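The inventory reported above for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 translates into schedulable capacity as (total - reserved) * allocation_ratio per resource class. A quick worked example with the logged numbers (plain arithmetic, no Placement calls):

    inventory = {
        "VCPU":      {"total": 12,    "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 16023, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 40,    "reserved": 0,   "allocation_ratio": 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(rc, capacity)   # VCPU 48.0, MEMORY_MB 15511.0, DISK_GB 40.0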
Apr 20 16:08:28 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:28 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:28 user nova-compute[71605]: DEBUG nova.compute.manager [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Starting instance... {{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:08:29 user nova-compute[71605]: INFO nova.compute.claims [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Claim successful on node user Apr 20 16:08:29 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.310s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG nova.compute.manager [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Start building networks asynchronously for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG nova.compute.manager [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Allocating IP information in the background. {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG nova.network.neutron [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:08:29 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 20 16:08:29 user nova-compute[71605]: DEBUG nova.compute.manager [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG nova.policy [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '690c49feae904687826fb959ba5ba283', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71cf2664111f45788d24092e8ceede9c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG nova.compute.manager [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Start spawning the instance on the hypervisor. {{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:08:29 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Creating image(s) Apr 20 16:08:29 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "/opt/stack/data/nova/instances/f6d19a54-ca7e-46fc-af21-6a7ddbc6604f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "/opt/stack/data/nova/instances/f6d19a54-ca7e-46fc-af21-6a7ddbc6604f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils 
[None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "/opt/stack/data/nova/instances/f6d19a54-ca7e-46fc-af21-6a7ddbc6604f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.142s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.147s 
{{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/f6d19a54-ca7e-46fc-af21-6a7ddbc6604f/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/f6d19a54-ca7e-46fc-af21-6a7ddbc6604f/disk 1073741824" returned: 0 in 0.056s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.212s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:29 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:08:30 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.151s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:08:30 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Checking if we can resize image /opt/stack/data/nova/instances/f6d19a54-ca7e-46fc-af21-6a7ddbc6604f/disk. 
size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:08:30 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f6d19a54-ca7e-46fc-af21-6a7ddbc6604f/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:08:30 user nova-compute[71605]: DEBUG nova.network.neutron [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Successfully created port: 9223f738-4299-44ed-8e8f-c39e3353e39d {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:08:30 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f6d19a54-ca7e-46fc-af21-6a7ddbc6604f/disk --force-share --output=json" returned: 0 in 0.159s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:08:30 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Cannot resize image /opt/stack/data/nova/instances/f6d19a54-ca7e-46fc-af21-6a7ddbc6604f/disk to a smaller size. 
{{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:08:30 user nova-compute[71605]: DEBUG nova.objects.instance [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lazy-loading 'migration_context' on Instance uuid f6d19a54-ca7e-46fc-af21-6a7ddbc6604f {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:08:30 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:08:30 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Ensure instance console log exists: /opt/stack/data/nova/instances/f6d19a54-ca7e-46fc-af21-6a7ddbc6604f/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:08:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:30 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.network.neutron [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Successfully updated port: 9223f738-4299-44ed-8e8f-c39e3353e39d {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "refresh_cache-f6d19a54-ca7e-46fc-af21-6a7ddbc6604f" {{(pid=71605) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquired lock "refresh_cache-f6d19a54-ca7e-46fc-af21-6a7ddbc6604f" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.network.neutron [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.compute.manager [req-a7b62f94-8b62-4b33-8469-53d0bfd88218 req-1838486e-464b-4510-89c9-a55de6e8b680 service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Received event network-changed-9223f738-4299-44ed-8e8f-c39e3353e39d {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.compute.manager [req-a7b62f94-8b62-4b33-8469-53d0bfd88218 req-1838486e-464b-4510-89c9-a55de6e8b680 service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Refreshing instance network info cache due to event network-changed-9223f738-4299-44ed-8e8f-c39e3353e39d. {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a7b62f94-8b62-4b33-8469-53d0bfd88218 req-1838486e-464b-4510-89c9-a55de6e8b680 service nova] Acquiring lock "refresh_cache-f6d19a54-ca7e-46fc-af21-6a7ddbc6604f" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.network.neutron [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Instance cache missing network info. 
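The records above show two contexts converging on the same instance: the build request (req-bd6fa3be-...) is rebuilding the network info cache while the Neutron external-event handler (req-a7b62f94-..., reacting to network-changed-9223f738-...) queues up behind the same per-instance "refresh_cache-<uuid>" lock before it may touch the cache. A rough illustration of that serialization pattern with oslo.concurrency (the helper and its callbacks are illustrative, not Nova's actual code):

    from oslo_concurrency import lockutils

    def refresh_cache(instance_uuid, fetch_nw_info, save):
        # Hypothetical helper: one in-process lock per instance, shared by the
        # spawn path and the external-event path, mirroring the
        # "refresh_cache-<uuid>" lock names in the records above.
        with lockutils.lock("refresh_cache-%s" % instance_uuid):
            nw_info = fetch_nw_info(instance_uuid)   # e.g. rebuild from Neutron ports
            save(instance_uuid, nw_info)             # update instance_info_cache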
{{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.network.neutron [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Updating instance_info_cache with network_info: [{"id": "9223f738-4299-44ed-8e8f-c39e3353e39d", "address": "fa:16:3e:d6:0d:19", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9223f738-42", "ovs_interfaceid": "9223f738-4299-44ed-8e8f-c39e3353e39d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Releasing lock "refresh_cache-f6d19a54-ca7e-46fc-af21-6a7ddbc6604f" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.compute.manager [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Instance network_info: |[{"id": "9223f738-4299-44ed-8e8f-c39e3353e39d", "address": "fa:16:3e:d6:0d:19", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9223f738-42", "ovs_interfaceid": "9223f738-4299-44ed-8e8f-c39e3353e39d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a7b62f94-8b62-4b33-8469-53d0bfd88218 req-1838486e-464b-4510-89c9-a55de6e8b680 service nova] Acquired lock "refresh_cache-f6d19a54-ca7e-46fc-af21-6a7ddbc6604f" {{(pid=71605) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.network.neutron [req-a7b62f94-8b62-4b33-8469-53d0bfd88218 req-1838486e-464b-4510-89c9-a55de6e8b680 service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Refreshing network info cache for port 9223f738-4299-44ed-8e8f-c39e3353e39d {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Start _get_guest_xml network_info=[{"id": "9223f738-4299-44ed-8e8f-c39e3353e39d", "address": "fa:16:3e:d6:0d:19", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9223f738-42", "ovs_interfaceid": "9223f738-4299-44ed-8e8f-c39e3353e39d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': '4ac69ea5-e5d7-40c8-864e-0a164d78a727'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:08:31 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:08:31 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) 
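With no flavor or image topology preferences (all 0:0:0) and the default limits of 65536 per dimension, the driver next enumerates CPU topologies whose product matches the flavor's single vCPU; the records that follow show it finds exactly one candidate, 1 socket x 1 core x 1 thread. A small sketch of that enumeration (illustrative only, not Nova's exact _get_possible_cpu_topologies implementation):

    import itertools

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        # Yield (sockets, cores, threads) triples whose product is exactly the
        # vCPU count and which respect the per-dimension limits logged above.
        for s, c, t in itertools.product(range(1, vcpus + 1), repeat=3):
            if s * c * t == vcpus and s <= max_sockets and c <= max_cores and t <= max_threads:
                yield (s, c, t)

    print(list(possible_topologies(1)))  # [(1, 1, 1)] -- the topology the log reports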
{{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:08:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-407901735',display_name='tempest-AttachVolumeNegativeTest-server-407901735',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-407901735',id=18,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG4OSrhcAGhvt/1td6lSrBTjgRGg10CjLCL1EmuHW6q7czt1RgqBpWAsoiQyoSTiBzeuddL47KN04jWageIBB5Wx1XgbbdYqtpRoz/r1eG4scj8/SDy6MikQDo96K7/ZPw==',key_name='tempest-keypair-627338015',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71cf2664111f45788d24092e8ceede9c',ramdisk_id='',reservation_id='r-wsud3dti',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-308436039',owner_user_name='tempest-AttachVolumeNegativeTest-308436039-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:08:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='690c49feae904687826fb959ba5ba283',uuid=f6d19a54-ca7e-46fc-af21-6a7ddbc6604f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9223f738-4299-44ed-8e8f-c39e3353e39d", "address": "fa:16:3e:d6:0d:19", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9223f738-42", "ovs_interfaceid": "9223f738-4299-44ed-8e8f-c39e3353e39d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Converting VIF {"id": "9223f738-4299-44ed-8e8f-c39e3353e39d", "address": "fa:16:3e:d6:0d:19", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9223f738-42", "ovs_interfaceid": "9223f738-4299-44ed-8e8f-c39e3353e39d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:0d:19,bridge_name='br-int',has_traffic_filtering=True,id=9223f738-4299-44ed-8e8f-c39e3353e39d,network=Network(2dc9b3da-0124-4718-9f70-a131cd030480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9223f738-42') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.objects.instance [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lazy-loading 'pci_devices' on Instance uuid f6d19a54-ca7e-46fc-af21-6a7ddbc6604f {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] End _get_guest_xml xml= Apr 20 16:08:31 user nova-compute[71605]: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f Apr 20 16:08:31 user nova-compute[71605]: instance-00000012 Apr 20 16:08:31 user nova-compute[71605]: 131072 Apr 20 16:08:31 user nova-compute[71605]: 1 Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: tempest-AttachVolumeNegativeTest-server-407901735 Apr 20 16:08:31 user nova-compute[71605]: 2023-04-20 16:08:31 Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: 128 Apr 20 16:08:31 user nova-compute[71605]: 1 Apr 20 16:08:31 user nova-compute[71605]: 0 Apr 20 16:08:31 user nova-compute[71605]: 0 Apr 20 16:08:31 user nova-compute[71605]: 1 Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: tempest-AttachVolumeNegativeTest-308436039-project-member Apr 20 16:08:31 user nova-compute[71605]: tempest-AttachVolumeNegativeTest-308436039 Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: OpenStack Foundation Apr 20 16:08:31 user nova-compute[71605]: OpenStack Nova Apr 20 16:08:31 user nova-compute[71605]: 0.0.0 Apr 20 16:08:31 user 
nova-compute[71605]: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f Apr 20 16:08:31 user nova-compute[71605]: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f Apr 20 16:08:31 user nova-compute[71605]: Virtual Machine Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: hvm Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Nehalem Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: /dev/urandom Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: Apr 20 16:08:31 user nova-compute[71605]: {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:08:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-407901735',display_name='tempest-AttachVolumeNegativeTest-server-407901735',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-407901735',id=18,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG4OSrhcAGhvt/1td6lSrBTjgRGg10CjLCL1EmuHW6q7czt1RgqBpWAsoiQyoSTiBzeuddL47KN04jWageIBB5Wx1XgbbdYqtpRoz/r1eG4scj8/SDy6MikQDo96K7/ZPw==',key_name='tempest-keypair-627338015',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71cf2664111f45788d24092e8ceede9c',ramdisk_id='',reservation_id='r-wsud3dti',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-308436039',owner_user_name='tempest-AttachVolumeNegativeTest-308436039-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:08:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='690c49feae904687826fb959ba5ba283',uuid=f6d19a54-ca7e-46fc-af21-6a7ddbc6604f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9223f738-4299-44ed-8e8f-c39e3353e39d", "address": "fa:16:3e:d6:0d:19", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9223f738-42", "ovs_interfaceid": "9223f738-4299-44ed-8e8f-c39e3353e39d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Converting VIF {"id": "9223f738-4299-44ed-8e8f-c39e3353e39d", "address": "fa:16:3e:d6:0d:19", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9223f738-42", "ovs_interfaceid": "9223f738-4299-44ed-8e8f-c39e3353e39d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:0d:19,bridge_name='br-int',has_traffic_filtering=True,id=9223f738-4299-44ed-8e8f-c39e3353e39d,network=Network(2dc9b3da-0124-4718-9f70-a131cd030480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9223f738-42') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG os_vif [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:0d:19,bridge_name='br-int',has_traffic_filtering=True,id=9223f738-4299-44ed-8e8f-c39e3353e39d,network=Network(2dc9b3da-0124-4718-9f70-a131cd030480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9223f738-42') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9223f738-42, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9223f738-42, col_values=(('external_ids', {'iface-id': '9223f738-4299-44ed-8e8f-c39e3353e39d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d6:0d:19', 'vm-uuid': 
'f6d19a54-ca7e-46fc-af21-6a7ddbc6604f'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:31 user nova-compute[71605]: INFO os_vif [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:0d:19,bridge_name='br-int',has_traffic_filtering=True,id=9223f738-4299-44ed-8e8f-c39e3353e39d,network=Network(2dc9b3da-0124-4718-9f70-a131cd030480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9223f738-42') Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] No BDM found with device name vda, not building metadata. {{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] No VIF found with MAC fa:16:3e:d6:0d:19, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.network.neutron [req-a7b62f94-8b62-4b33-8469-53d0bfd88218 req-1838486e-464b-4510-89c9-a55de6e8b680 service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Updated VIF entry in instance network info cache for port 9223f738-4299-44ed-8e8f-c39e3353e39d. 
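The AddBridgeCommand/AddPortCommand/DbSetCommand transaction above is os-vif's ovs plugin idempotently creating the tap device's port on br-int and tagging its Interface row with the Neutron port id, MAC and instance UUID (the iface-id is what lets OVN bind the logical switch port). A rough command-line equivalent, driven from Python purely for illustration and using the values from this log (plug_ovs_port is a hypothetical helper, not an os-vif API):

    import subprocess

    PORT_ID = "9223f738-4299-44ed-8e8f-c39e3353e39d"
    MAC = "fa:16:3e:d6:0d:19"
    VM_UUID = "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f"
    DEV = "tap9223f738-42"

    def plug_ovs_port():
        # Approximate ovs-vsctl equivalent of the ovsdbapp transaction in the log.
        subprocess.check_call(["ovs-vsctl", "--may-exist", "add-br", "br-int"])
        subprocess.check_call([
            "ovs-vsctl", "--may-exist", "add-port", "br-int", DEV,
            "--", "set", "Interface", DEV,
            "external_ids:iface-id=" + PORT_ID,
            "external_ids:iface-status=active",
            "external_ids:attached-mac=" + MAC,
            "external_ids:vm-uuid=" + VM_UUID,
        ])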
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG nova.network.neutron [req-a7b62f94-8b62-4b33-8469-53d0bfd88218 req-1838486e-464b-4510-89c9-a55de6e8b680 service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Updating instance_info_cache with network_info: [{"id": "9223f738-4299-44ed-8e8f-c39e3353e39d", "address": "fa:16:3e:d6:0d:19", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9223f738-42", "ovs_interfaceid": "9223f738-4299-44ed-8e8f-c39e3353e39d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:08:31 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-a7b62f94-8b62-4b33-8469-53d0bfd88218 req-1838486e-464b-4510-89c9-a55de6e8b680 service nova] Releasing lock "refresh_cache-f6d19a54-ca7e-46fc-af21-6a7ddbc6604f" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:08:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquiring lock "fe0bde76-a4f8-4865-91af-2bd3790587a7" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquiring lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:32 user nova-compute[71605]: INFO nova.compute.manager [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Terminating instance Apr 20 16:08:32 user nova-compute[71605]: DEBUG nova.compute.manager [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Start destroying the instance on the hypervisor. {{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:08:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Acquiring lock "dc918ed4-8bc6-4a4f-a189-d6cdd5817854" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "dc918ed4-8bc6-4a4f-a189-d6cdd5817854" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Acquiring lock "dc918ed4-8bc6-4a4f-a189-d6cdd5817854-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "dc918ed4-8bc6-4a4f-a189-d6cdd5817854-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71605) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "dc918ed4-8bc6-4a4f-a189-d6cdd5817854-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:32 user nova-compute[71605]: INFO nova.compute.manager [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Terminating instance Apr 20 16:08:32 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Start destroying the instance on the hypervisor. {{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:08:33 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:33 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:33 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:33 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:33 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:33 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:33 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:33 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:33 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:33 user nova-compute[71605]: DEBUG nova.compute.manager [req-430cd2e1-4068-4d2e-9f7f-fb0a99338f33 req-0061ca9f-bcc7-4235-b0f1-0b108940c5b1 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Received event network-vif-unplugged-74703b46-6b03-4752-953b-9c64a63249c8 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:08:33 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-430cd2e1-4068-4d2e-9f7f-fb0a99338f33 req-0061ca9f-bcc7-4235-b0f1-0b108940c5b1 service nova] Acquiring lock "dc918ed4-8bc6-4a4f-a189-d6cdd5817854-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:33 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-430cd2e1-4068-4d2e-9f7f-fb0a99338f33 req-0061ca9f-bcc7-4235-b0f1-0b108940c5b1 service nova] Lock "dc918ed4-8bc6-4a4f-a189-d6cdd5817854-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:33 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-430cd2e1-4068-4d2e-9f7f-fb0a99338f33 req-0061ca9f-bcc7-4235-b0f1-0b108940c5b1 service nova] Lock "dc918ed4-8bc6-4a4f-a189-d6cdd5817854-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:33 user nova-compute[71605]: DEBUG nova.compute.manager [req-430cd2e1-4068-4d2e-9f7f-fb0a99338f33 req-0061ca9f-bcc7-4235-b0f1-0b108940c5b1 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] No waiting events found dispatching network-vif-unplugged-74703b46-6b03-4752-953b-9c64a63249c8 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:08:33 user nova-compute[71605]: DEBUG nova.compute.manager [req-430cd2e1-4068-4d2e-9f7f-fb0a99338f33 req-0061ca9f-bcc7-4235-b0f1-0b108940c5b1 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Received event network-vif-unplugged-74703b46-6b03-4752-953b-9c64a63249c8 for instance with task_state deleting. {{(pid=71605) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 16:08:33 user nova-compute[71605]: DEBUG nova.compute.manager [req-b36edb28-c674-438d-87ea-84ba5e0c57b9 req-51b7fdb2-b023-4f45-ad6a-702251a1dfc8 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received event network-vif-unplugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:08:33 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b36edb28-c674-438d-87ea-84ba5e0c57b9 req-51b7fdb2-b023-4f45-ad6a-702251a1dfc8 service nova] Acquiring lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:33 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b36edb28-c674-438d-87ea-84ba5e0c57b9 req-51b7fdb2-b023-4f45-ad6a-702251a1dfc8 service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:33 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b36edb28-c674-438d-87ea-84ba5e0c57b9 req-51b7fdb2-b023-4f45-ad6a-702251a1dfc8 service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:33 user nova-compute[71605]: DEBUG nova.compute.manager [req-b36edb28-c674-438d-87ea-84ba5e0c57b9 
req-51b7fdb2-b023-4f45-ad6a-702251a1dfc8 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] No waiting events found dispatching network-vif-unplugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:08:33 user nova-compute[71605]: DEBUG nova.compute.manager [req-b36edb28-c674-438d-87ea-84ba5e0c57b9 req-51b7fdb2-b023-4f45-ad6a-702251a1dfc8 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received event network-vif-unplugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad for instance with task_state deleting. {{(pid=71605) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 16:08:34 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Instance destroyed successfully. Apr 20 16:08:34 user nova-compute[71605]: DEBUG nova.objects.instance [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lazy-loading 'resources' on Instance uuid fe0bde76-a4f8-4865-91af-2bd3790587a7 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:03:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1422609663',display_name='tempest-ServerRescueNegativeTestJSON-server-1422609663',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1422609663',id=9,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-20T16:05:44Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='1d4a73ba128147f295bf6a4545fede47',ramdisk_id='',reservation_id='r-0526y6jk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-237285916',owner_user_name='tempest-ServerRescueNegativeTestJSON-237285916-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:05:45Z,user_d
ata=None,user_id='e51e637e06d1475692c4055ae99121da',uuid=fe0bde76-a4f8-4865-91af-2bd3790587a7,vcpu_model=,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "address": "fa:16:3e:dd:52:dd", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f4d2191-16", "ovs_interfaceid": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Converting VIF {"id": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "address": "fa:16:3e:dd:52:dd", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9f4d2191-16", "ovs_interfaceid": "9f4d2191-16c0-4ab6-a4bd-f016499a9aad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:52:dd,bridge_name='br-int',has_traffic_filtering=True,id=9f4d2191-16c0-4ab6-a4bd-f016499a9aad,network=Network(4b5db782-8dbb-4f06-8e98-a794013dbc8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f4d2191-16') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG os_vif [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Unplugging vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:dd:52:dd,bridge_name='br-int',has_traffic_filtering=True,id=9f4d2191-16c0-4ab6-a4bd-f016499a9aad,network=Network(4b5db782-8dbb-4f06-8e98-a794013dbc8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f4d2191-16') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9f4d2191-16, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:34 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Instance destroyed successfully. Apr 20 16:08:34 user nova-compute[71605]: DEBUG nova.objects.instance [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lazy-loading 'resources' on Instance uuid dc918ed4-8bc6-4a4f-a189-d6cdd5817854 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:34 user nova-compute[71605]: INFO os_vif [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:52:dd,bridge_name='br-int',has_traffic_filtering=True,id=9f4d2191-16c0-4ab6-a4bd-f016499a9aad,network=Network(4b5db782-8dbb-4f06-8e98-a794013dbc8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9f4d2191-16') Apr 20 16:08:34 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Deleting instance files /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7_del Apr 20 16:08:34 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Deletion of /opt/stack/data/nova/instances/fe0bde76-a4f8-4865-91af-2bd3790587a7_del complete Apr 20 16:08:34 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:03:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-247130899',display_name='tempest-VolumesAdminNegativeTest-server-247130899',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-247130899',id=10,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHvL2UEJPc4nJnMX0NLHsUyPpamaI8REueYO620VKU6jmG9moA3aOhnIV+8OJ4FygGtNs0JXD2mYZ/x6dT7j7bCftPAI8gs/5YWqGZxyEGNZggDwOTj0cc8sKDuS204Umw==',key_name='tempest-keypair-1405486078',keypairs=,launch_index=0,launched_at=2023-04-20T16:04:02Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='a92cea9e1182477ca669c506b42eda60',ramdisk_id='',reservation_id='r-0jl8eopp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesAdminNegativeTest-978356230',owner_user_name='tempest-VolumesAdminNegativeTest-978356230-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:04:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c92692a1d38b4531a4e7f42660a54c7b',uuid=dc918ed4-8bc6-4a4f-a189-d6cdd5817854,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "74703b46-6b03-4752-953b-9c64a63249c8", "address": "fa:16:3e:c5:94:d0", "network": {"id": "40132b20-6bfd-4f5a-8f6f-75769961d157", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-683065417-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.7", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a92cea9e1182477ca669c506b42eda60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap74703b46-6b", "ovs_interfaceid": "74703b46-6b03-4752-953b-9c64a63249c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Converting VIF {"id": "74703b46-6b03-4752-953b-9c64a63249c8", "address": "fa:16:3e:c5:94:d0", "network": {"id": "40132b20-6bfd-4f5a-8f6f-75769961d157", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-683065417-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.7", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a92cea9e1182477ca669c506b42eda60", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap74703b46-6b", "ovs_interfaceid": "74703b46-6b03-4752-953b-9c64a63249c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c5:94:d0,bridge_name='br-int',has_traffic_filtering=True,id=74703b46-6b03-4752-953b-9c64a63249c8,network=Network(40132b20-6bfd-4f5a-8f6f-75769961d157),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74703b46-6b') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG os_vif [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c5:94:d0,bridge_name='br-int',has_traffic_filtering=True,id=74703b46-6b03-4752-953b-9c64a63249c8,network=Network(40132b20-6bfd-4f5a-8f6f-75769961d157),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74703b46-6b') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74703b46-6b, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 
{{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:34 user nova-compute[71605]: INFO os_vif [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c5:94:d0,bridge_name='br-int',has_traffic_filtering=True,id=74703b46-6b03-4752-953b-9c64a63249c8,network=Network(40132b20-6bfd-4f5a-8f6f-75769961d157),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74703b46-6b') Apr 20 16:08:34 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Deleting instance files /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854_del Apr 20 16:08:34 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Deletion of /opt/stack/data/nova/instances/dc918ed4-8bc6-4a4f-a189-d6cdd5817854_del complete Apr 20 16:08:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:34 user nova-compute[71605]: INFO nova.compute.manager [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Took 1.33 seconds to destroy the instance on the hypervisor. Apr 20 16:08:34 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:34 user nova-compute[71605]: INFO nova.compute.manager [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Took 1.29 seconds to destroy the instance on the hypervisor. 
Apr 20 16:08:34 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:08:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:08:35 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Took 1.01 seconds to deallocate network for instance. Apr 20 16:08:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:08:35 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Took 1.14 seconds to deallocate network for instance. 
Apr 20 16:08:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.297s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.098s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:35 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Deleted allocations for instance fe0bde76-a4f8-4865-91af-2bd3790587a7 Apr 20 16:08:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-56741221-d9d7-4f0e-8629-9a6021dc62c2 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.843s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Inventory has not changed in ProviderTree for provider: 
00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.271s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.compute.manager [req-ad5c6a56-5a31-4660-b3c2-afdc1ad72cdd req-db778cad-3705-41d2-b3d4-478e4b149220 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received event network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ad5c6a56-5a31-4660-b3c2-afdc1ad72cdd req-db778cad-3705-41d2-b3d4-478e4b149220 service nova] Acquiring lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ad5c6a56-5a31-4660-b3c2-afdc1ad72cdd req-db778cad-3705-41d2-b3d4-478e4b149220 service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ad5c6a56-5a31-4660-b3c2-afdc1ad72cdd req-db778cad-3705-41d2-b3d4-478e4b149220 service nova] Lock "fe0bde76-a4f8-4865-91af-2bd3790587a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.compute.manager [req-ad5c6a56-5a31-4660-b3c2-afdc1ad72cdd req-db778cad-3705-41d2-b3d4-478e4b149220 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] No waiting events found dispatching network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:08:35 user nova-compute[71605]: WARNING nova.compute.manager [req-ad5c6a56-5a31-4660-b3c2-afdc1ad72cdd req-db778cad-3705-41d2-b3d4-478e4b149220 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received 
unexpected event network-vif-plugged-9f4d2191-16c0-4ab6-a4bd-f016499a9aad for instance with vm_state deleted and task_state None. Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.compute.manager [req-ad5c6a56-5a31-4660-b3c2-afdc1ad72cdd req-db778cad-3705-41d2-b3d4-478e4b149220 service nova] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Received event network-vif-deleted-9f4d2191-16c0-4ab6-a4bd-f016499a9aad {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.compute.manager [req-ad5c6a56-5a31-4660-b3c2-afdc1ad72cdd req-db778cad-3705-41d2-b3d4-478e4b149220 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Received event network-vif-deleted-74703b46-6b03-4752-953b-9c64a63249c8 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:08:35 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Deleted allocations for instance dc918ed4-8bc6-4a4f-a189-d6cdd5817854 Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:08:35 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] VM Resumed (Lifecycle Event) Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.compute.manager [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:08:35 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Instance spawned successfully. 
Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Found default for hw_disk_bus of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: 
f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Found default for hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-ccc4485d-07c2-4101-b5af-b6e3e0fec6d4 tempest-VolumesAdminNegativeTest-978356230 tempest-VolumesAdminNegativeTest-978356230-project-member] Lock "dc918ed4-8bc6-4a4f-a189-d6cdd5817854" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 3.007s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:35 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:08:35 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] VM Started (Lifecycle Event) Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:08:35 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:08:35 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:08:36 user nova-compute[71605]: INFO nova.compute.manager [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Took 6.48 seconds to spawn the instance on the hypervisor. Apr 20 16:08:36 user nova-compute[71605]: DEBUG nova.compute.manager [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:08:36 user nova-compute[71605]: INFO nova.compute.manager [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Took 7.10 seconds to build instance. 
Apr 20 16:08:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-bd6fa3be-8657-4417-844e-c9c5eaec9e19 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.187s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:36 user nova-compute[71605]: DEBUG nova.compute.manager [req-b31e41e5-13ab-45c2-8c8f-0dece4e8c099 req-5727c52e-0c35-4f02-ac7c-191863f39333 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Received event network-vif-plugged-74703b46-6b03-4752-953b-9c64a63249c8 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:08:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b31e41e5-13ab-45c2-8c8f-0dece4e8c099 req-5727c52e-0c35-4f02-ac7c-191863f39333 service nova] Acquiring lock "dc918ed4-8bc6-4a4f-a189-d6cdd5817854-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b31e41e5-13ab-45c2-8c8f-0dece4e8c099 req-5727c52e-0c35-4f02-ac7c-191863f39333 service nova] Lock "dc918ed4-8bc6-4a4f-a189-d6cdd5817854-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b31e41e5-13ab-45c2-8c8f-0dece4e8c099 req-5727c52e-0c35-4f02-ac7c-191863f39333 service nova] Lock "dc918ed4-8bc6-4a4f-a189-d6cdd5817854-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:36 user nova-compute[71605]: DEBUG nova.compute.manager [req-b31e41e5-13ab-45c2-8c8f-0dece4e8c099 req-5727c52e-0c35-4f02-ac7c-191863f39333 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] No waiting events found dispatching network-vif-plugged-74703b46-6b03-4752-953b-9c64a63249c8 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:08:36 user nova-compute[71605]: WARNING nova.compute.manager [req-b31e41e5-13ab-45c2-8c8f-0dece4e8c099 req-5727c52e-0c35-4f02-ac7c-191863f39333 service nova] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Received unexpected event network-vif-plugged-74703b46-6b03-4752-953b-9c64a63249c8 for instance with vm_state deleted and task_state None. 
Apr 20 16:08:36 user nova-compute[71605]: DEBUG nova.compute.manager [req-b31e41e5-13ab-45c2-8c8f-0dece4e8c099 req-5727c52e-0c35-4f02-ac7c-191863f39333 service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Received event network-vif-plugged-9223f738-4299-44ed-8e8f-c39e3353e39d {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:08:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b31e41e5-13ab-45c2-8c8f-0dece4e8c099 req-5727c52e-0c35-4f02-ac7c-191863f39333 service nova] Acquiring lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b31e41e5-13ab-45c2-8c8f-0dece4e8c099 req-5727c52e-0c35-4f02-ac7c-191863f39333 service nova] Lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b31e41e5-13ab-45c2-8c8f-0dece4e8c099 req-5727c52e-0c35-4f02-ac7c-191863f39333 service nova] Lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:36 user nova-compute[71605]: DEBUG nova.compute.manager [req-b31e41e5-13ab-45c2-8c8f-0dece4e8c099 req-5727c52e-0c35-4f02-ac7c-191863f39333 service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] No waiting events found dispatching network-vif-plugged-9223f738-4299-44ed-8e8f-c39e3353e39d {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:08:36 user nova-compute[71605]: WARNING nova.compute.manager [req-b31e41e5-13ab-45c2-8c8f-0dece4e8c099 req-5727c52e-0c35-4f02-ac7c-191863f39333 service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Received unexpected event network-vif-plugged-9223f738-4299-44ed-8e8f-c39e3353e39d for instance with vm_state active and task_state None. 
Apr 20 16:08:36 user nova-compute[71605]: DEBUG nova.compute.manager [req-b31e41e5-13ab-45c2-8c8f-0dece4e8c099 req-5727c52e-0c35-4f02-ac7c-191863f39333 service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Received event network-vif-plugged-9223f738-4299-44ed-8e8f-c39e3353e39d {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:08:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b31e41e5-13ab-45c2-8c8f-0dece4e8c099 req-5727c52e-0c35-4f02-ac7c-191863f39333 service nova] Acquiring lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b31e41e5-13ab-45c2-8c8f-0dece4e8c099 req-5727c52e-0c35-4f02-ac7c-191863f39333 service nova] Lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b31e41e5-13ab-45c2-8c8f-0dece4e8c099 req-5727c52e-0c35-4f02-ac7c-191863f39333 service nova] Lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:36 user nova-compute[71605]: DEBUG nova.compute.manager [req-b31e41e5-13ab-45c2-8c8f-0dece4e8c099 req-5727c52e-0c35-4f02-ac7c-191863f39333 service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] No waiting events found dispatching network-vif-plugged-9223f738-4299-44ed-8e8f-c39e3353e39d {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:08:36 user nova-compute[71605]: WARNING nova.compute.manager [req-b31e41e5-13ab-45c2-8c8f-0dece4e8c099 req-5727c52e-0c35-4f02-ac7c-191863f39333 service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Received unexpected event network-vif-plugged-9223f738-4299-44ed-8e8f-c39e3353e39d for instance with vm_state active and task_state None. 
Apr 20 16:08:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:40 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:08:40 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Cleaning up deleted instances with incomplete migration {{(pid=71605) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 20 16:08:41 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:08:41 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:08:41 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] VM Stopped (Lifecycle Event) Apr 20 16:08:41 user nova-compute[71605]: DEBUG nova.compute.manager [None req-050a1d47-18a5-4386-a2b2-5f99ae4878cb None None] [instance: 3ac0a246-e2fe-4164-9bc1-c96bb94e396f] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:08:42 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:08:42 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:08:42 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:08:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None 
req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:42 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:08:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:08:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:08:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:08:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b/disk --force-share --output=json" returned: 0 in 0.156s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:08:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4f841186-7958-4642-9050-9b048b61ebbb/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:08:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4f841186-7958-4642-9050-9b048b61ebbb/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:08:42 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None 
None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4f841186-7958-4642-9050-9b048b61ebbb/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:08:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4f841186-7958-4642-9050-9b048b61ebbb/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:08:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:08:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:08:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:08:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:08:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f6d19a54-ca7e-46fc-af21-6a7ddbc6604f/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:08:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f6d19a54-ca7e-46fc-af21-6a7ddbc6604f/disk --force-share --output=json" 
returned: 0 in 0.130s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:08:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f6d19a54-ca7e-46fc-af21-6a7ddbc6604f/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:08:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f6d19a54-ca7e-46fc-af21-6a7ddbc6604f/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:08:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:08:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:08:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:08:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:08:44 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:44 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 16:08:44 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:08:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=8411MB free_disk=26.354736328125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, 
"label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": 
"0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:08:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance d4ea4d29-b178-4da2-b971-76f97031b244 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:08:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance e8f62d46-e2dc-4870-adf1-f62d88bb653b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:08:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 15d42ba7-cf47-4374-83b5-06d5242951b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:08:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 4f841186-7958-4642-9050-9b048b61ebbb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:08:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance f6d19a54-ca7e-46fc-af21-6a7ddbc6604f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:08:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 5 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:08:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=1152MB phys_disk=40GB used_disk=5GB total_vcpus=12 used_vcpus=5 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:08:44 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:08:44 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:08:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:08:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.317s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:44 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:08:44 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Cleaning up deleted instances {{(pid=71605) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 20 16:08:44 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] There are 0 instances to clean {{(pid=71605) _run_pending_deletes 
/opt/stack/nova/nova/compute/manager.py:11088}} Apr 20 16:08:44 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:08:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:08:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:08:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:08:45 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:08:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:08:46 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:08:46 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:08:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-e8f62d46-e2dc-4870-adf1-f62d88bb653b" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:08:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-e8f62d46-e2dc-4870-adf1-f62d88bb653b" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:08:46 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:08:46 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Updating instance_info_cache with network_info: [{"id": 
"8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "address": "fa:16:3e:6d:26:c0", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8200d42f-0f", "ovs_interfaceid": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:08:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock "refresh_cache-e8f62d46-e2dc-4870-adf1-f62d88bb653b" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:08:46 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:08:46 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:08:49 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:08:49 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] VM Stopped (Lifecycle Event) Apr 20 16:08:49 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:08:49 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] VM Stopped (Lifecycle Event) Apr 20 16:08:49 user nova-compute[71605]: DEBUG nova.compute.manager [None req-2a8fd25f-6c41-484b-9da5-7bc30868c4fe None None] [instance: fe0bde76-a4f8-4865-91af-2bd3790587a7] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:08:49 user nova-compute[71605]: DEBUG nova.compute.manager [None req-973fea72-4795-4133-bdeb-e1d202e4d1ab None None] [instance: dc918ed4-8bc6-4a4f-a189-d6cdd5817854] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:08:49 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:08:49 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:54 user 
nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:08:57 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._sync_power_states {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:08:57 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Triggering sync for uuid d4ea4d29-b178-4da2-b971-76f97031b244 {{(pid=71605) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 20 16:08:57 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Triggering sync for uuid e8f62d46-e2dc-4870-adf1-f62d88bb653b {{(pid=71605) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 20 16:08:57 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Triggering sync for uuid 15d42ba7-cf47-4374-83b5-06d5242951b7 {{(pid=71605) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 20 16:08:57 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Triggering sync for uuid 4f841186-7958-4642-9050-9b048b61ebbb {{(pid=71605) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 20 16:08:57 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Triggering sync for uuid f6d19a54-ca7e-46fc-af21-6a7ddbc6604f {{(pid=71605) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 20 16:08:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "d4ea4d29-b178-4da2-b971-76f97031b244" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "d4ea4d29-b178-4da2-b971-76f97031b244" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s {{(pid=71605) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "15d42ba7-cf47-4374-83b5-06d5242951b7" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "15d42ba7-cf47-4374-83b5-06d5242951b7" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "4f841186-7958-4642-9050-9b048b61ebbb" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "4f841186-7958-4642-9050-9b048b61ebbb" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:08:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:08:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.057s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "15d42ba7-cf47-4374-83b5-06d5242951b7" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.056s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.059s {{(pid=71605) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "d4ea4d29-b178-4da2-b971-76f97031b244" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.069s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "4f841186-7958-4642-9050-9b048b61ebbb" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.084s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:08:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:04 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:04 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:09 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:14 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:09:14 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:19 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:19 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:19 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquiring lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:09:24 user nova-compute[71605]: 
DEBUG oslo_concurrency.lockutils [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquiring lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:09:24 user nova-compute[71605]: INFO nova.compute.manager [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Terminating instance Apr 20 16:09:24 user nova-compute[71605]: DEBUG nova.compute.manager [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Start destroying the instance on the hypervisor. 
{{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG nova.compute.manager [req-7befd85a-ff75-4cd0-a053-551b27ef3074 req-ae2060fc-a67f-4393-9fb9-90a29e933b6d service nova] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Received event network-vif-unplugged-8200d42f-0f8f-439d-8ea8-1eea4fba54d6 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-7befd85a-ff75-4cd0-a053-551b27ef3074 req-ae2060fc-a67f-4393-9fb9-90a29e933b6d service nova] Acquiring lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-7befd85a-ff75-4cd0-a053-551b27ef3074 req-ae2060fc-a67f-4393-9fb9-90a29e933b6d service nova] Lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-7befd85a-ff75-4cd0-a053-551b27ef3074 req-ae2060fc-a67f-4393-9fb9-90a29e933b6d service nova] Lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG nova.compute.manager [req-7befd85a-ff75-4cd0-a053-551b27ef3074 req-ae2060fc-a67f-4393-9fb9-90a29e933b6d service nova] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] No waiting events found dispatching network-vif-unplugged-8200d42f-0f8f-439d-8ea8-1eea4fba54d6 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG nova.compute.manager [req-7befd85a-ff75-4cd0-a053-551b27ef3074 req-ae2060fc-a67f-4393-9fb9-90a29e933b6d service nova] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Received event network-vif-unplugged-8200d42f-0f8f-439d-8ea8-1eea4fba54d6 for instance with task_state deleting. 
{{(pid=71605) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:24 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Instance destroyed successfully. Apr 20 16:09:24 user nova-compute[71605]: DEBUG nova.objects.instance [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lazy-loading 'resources' on Instance uuid e8f62d46-e2dc-4870-adf1-f62d88bb653b {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:03:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-2146872314',display_name='tempest-ServerRescueNegativeTestJSON-server-2146872314',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-2146872314',id=8,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-20T16:03:58Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='1d4a73ba128147f295bf6a4545fede47',ramdisk_id='',reservation_id='r-oxzlvpr2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-237285916',owner_user_name='tempest-ServerRescueNegativeTestJSON-237285916-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:03:59Z,user_data=None,user_id='e51e637e06d1475692c4055ae99121da',uuid=e8f62d46-e2dc-4870-adf1-f62d88bb653b,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "address": "fa:16:3e:6d:26:c0", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": 
[], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8200d42f-0f", "ovs_interfaceid": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Converting VIF {"id": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "address": "fa:16:3e:6d:26:c0", "network": {"id": "4b5db782-8dbb-4f06-8e98-a794013dbc8c", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-1330432693-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1d4a73ba128147f295bf6a4545fede47", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap8200d42f-0f", "ovs_interfaceid": "8200d42f-0f8f-439d-8ea8-1eea4fba54d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6d:26:c0,bridge_name='br-int',has_traffic_filtering=True,id=8200d42f-0f8f-439d-8ea8-1eea4fba54d6,network=Network(4b5db782-8dbb-4f06-8e98-a794013dbc8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8200d42f-0f') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG os_vif [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6d:26:c0,bridge_name='br-int',has_traffic_filtering=True,id=8200d42f-0f8f-439d-8ea8-1eea4fba54d6,network=Network(4b5db782-8dbb-4f06-8e98-a794013dbc8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8200d42f-0f') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8200d42f-0f, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:09:24 user nova-compute[71605]: INFO os_vif [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6d:26:c0,bridge_name='br-int',has_traffic_filtering=True,id=8200d42f-0f8f-439d-8ea8-1eea4fba54d6,network=Network(4b5db782-8dbb-4f06-8e98-a794013dbc8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8200d42f-0f') Apr 20 16:09:24 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Deleting instance files /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b_del Apr 20 16:09:24 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Deletion of /opt/stack/data/nova/instances/e8f62d46-e2dc-4870-adf1-f62d88bb653b_del complete Apr 20 16:09:24 user nova-compute[71605]: INFO nova.compute.manager [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Took 0.86 seconds to destroy the instance on the hypervisor. Apr 20 16:09:24 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:09:24 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:09:25 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:09:25 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Took 0.50 seconds to deallocate network for instance. Apr 20 16:09:25 user nova-compute[71605]: DEBUG nova.compute.manager [req-20cc6a2d-ea00-425a-9e9e-4edade0bfd4d req-bdc020d8-b104-43df-aee7-56d72e4b4330 service nova] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Received event network-vif-deleted-8200d42f-0f8f-439d-8ea8-1eea4fba54d6 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:09:25 user nova-compute[71605]: INFO nova.compute.manager [req-20cc6a2d-ea00-425a-9e9e-4edade0bfd4d req-bdc020d8-b104-43df-aee7-56d72e4b4330 service nova] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Neutron deleted interface 8200d42f-0f8f-439d-8ea8-1eea4fba54d6; detaching it from the instance and deleting it from the info cache Apr 20 16:09:25 user nova-compute[71605]: DEBUG nova.network.neutron [req-20cc6a2d-ea00-425a-9e9e-4edade0bfd4d req-bdc020d8-b104-43df-aee7-56d72e4b4330 service nova] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:09:25 user nova-compute[71605]: DEBUG nova.compute.manager [req-20cc6a2d-ea00-425a-9e9e-4edade0bfd4d req-bdc020d8-b104-43df-aee7-56d72e4b4330 service nova] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Detach interface failed, port_id=8200d42f-0f8f-439d-8ea8-1eea4fba54d6, reason: Instance e8f62d46-e2dc-4870-adf1-f62d88bb653b could not be found. 
{{(pid=71605) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 20 16:09:25 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:09:25 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:09:25 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Refreshing inventories for resource provider 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 20 16:09:25 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Updating ProviderTree inventory for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 20 16:09:25 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Updating inventory in ProviderTree for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 20 16:09:25 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Refreshing aggregate associations for resource provider 00e9f769-1a1c-4f1e-80e4-b19657803102, aggregates: None {{(pid=71605) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 20 16:09:25 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 
tempest-ServerRescueNegativeTestJSON-237285916-project-member] Refreshing trait associations for resource provider 00e9f769-1a1c-4f1e-80e4-b19657803102, traits: COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_MMX,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI {{(pid=71605) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 20 16:09:26 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:09:26 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:09:26 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.568s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:09:26 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 tempest-ServerRescueNegativeTestJSON-237285916-project-member] Deleted allocations for instance e8f62d46-e2dc-4870-adf1-f62d88bb653b Apr 20 16:09:26 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-13d0846d-7649-4508-9305-968e0510edd1 tempest-ServerRescueNegativeTestJSON-237285916 
tempest-ServerRescueNegativeTestJSON-237285916-project-member] Lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.104s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:09:26 user nova-compute[71605]: DEBUG nova.compute.manager [req-662c3a56-1c28-49ac-94eb-ead3e1ef73ea req-9d078075-b66a-441d-bca6-e3a981bc4666 service nova] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Received event network-vif-plugged-8200d42f-0f8f-439d-8ea8-1eea4fba54d6 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:09:26 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-662c3a56-1c28-49ac-94eb-ead3e1ef73ea req-9d078075-b66a-441d-bca6-e3a981bc4666 service nova] Acquiring lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:09:26 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-662c3a56-1c28-49ac-94eb-ead3e1ef73ea req-9d078075-b66a-441d-bca6-e3a981bc4666 service nova] Lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:09:26 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-662c3a56-1c28-49ac-94eb-ead3e1ef73ea req-9d078075-b66a-441d-bca6-e3a981bc4666 service nova] Lock "e8f62d46-e2dc-4870-adf1-f62d88bb653b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:09:26 user nova-compute[71605]: DEBUG nova.compute.manager [req-662c3a56-1c28-49ac-94eb-ead3e1ef73ea req-9d078075-b66a-441d-bca6-e3a981bc4666 service nova] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] No waiting events found dispatching network-vif-plugged-8200d42f-0f8f-439d-8ea8-1eea4fba54d6 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:09:26 user nova-compute[71605]: WARNING nova.compute.manager [req-662c3a56-1c28-49ac-94eb-ead3e1ef73ea req-9d078075-b66a-441d-bca6-e3a981bc4666 service nova] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Received unexpected event network-vif-plugged-8200d42f-0f8f-439d-8ea8-1eea4fba54d6 for instance with vm_state deleted and task_state None. 
Apr 20 16:09:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:29 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:30 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:39 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:09:39 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] VM Stopped (Lifecycle Event) Apr 20 16:09:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:09:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:09:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:09:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:09:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:39 user nova-compute[71605]: DEBUG nova.compute.manager [None req-60a842ca-74c3-4d6a-be42-5db3ac630f3e None None] [instance: e8f62d46-e2dc-4870-adf1-f62d88bb653b] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:09:41 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:09:42 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:09:42 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:09:43 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:09:43 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:09:43 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:09:43 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:09:43 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:09:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4f841186-7958-4642-9050-9b048b61ebbb/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:09:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4f841186-7958-4642-9050-9b048b61ebbb/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:09:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4f841186-7958-4642-9050-9b048b61ebbb/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:09:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4f841186-7958-4642-9050-9b048b61ebbb/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71605) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:09:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:09:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:09:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:09:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:09:43 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f6d19a54-ca7e-46fc-af21-6a7ddbc6604f/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:09:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f6d19a54-ca7e-46fc-af21-6a7ddbc6604f/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:09:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f6d19a54-ca7e-46fc-af21-6a7ddbc6604f/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:09:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD 
"/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/f6d19a54-ca7e-46fc-af21-6a7ddbc6604f/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:09:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:09:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:09:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:09:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:09:44 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:44 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:44 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:09:44 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 16:09:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=8585MB free_disk=26.349700927734375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:09:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:09:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance d4ea4d29-b178-4da2-b971-76f97031b244 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 15d42ba7-cf47-4374-83b5-06d5242951b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 4f841186-7958-4642-9050-9b048b61ebbb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance f6d19a54-ca7e-46fc-af21-6a7ddbc6604f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 4 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=1024MB phys_disk=40GB used_disk=4GB total_vcpus=12 used_vcpus=4 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.282s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Acquiring lock "4f841186-7958-4642-9050-9b048b61ebbb" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Lock "4f841186-7958-4642-9050-9b048b61ebbb" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Acquiring lock 
"4f841186-7958-4642-9050-9b048b61ebbb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Lock "4f841186-7958-4642-9050-9b048b61ebbb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Lock "4f841186-7958-4642-9050-9b048b61ebbb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:09:45 user nova-compute[71605]: INFO nova.compute.manager [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Terminating instance Apr 20 16:09:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Start destroying the instance on the hypervisor. {{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG nova.compute.manager [req-c8d5e0a9-9d27-4bb7-92e9-89c447d3a868 req-64add8bb-558e-423a-80d6-22daf8610cb9 service nova] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Received event network-vif-unplugged-fe7ac99b-2b51-4ae2-9903-9ae286328c8b {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-c8d5e0a9-9d27-4bb7-92e9-89c447d3a868 req-64add8bb-558e-423a-80d6-22daf8610cb9 service nova] Acquiring lock "4f841186-7958-4642-9050-9b048b61ebbb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-c8d5e0a9-9d27-4bb7-92e9-89c447d3a868 req-64add8bb-558e-423a-80d6-22daf8610cb9 service nova] Lock "4f841186-7958-4642-9050-9b048b61ebbb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-c8d5e0a9-9d27-4bb7-92e9-89c447d3a868 req-64add8bb-558e-423a-80d6-22daf8610cb9 
service nova] Lock "4f841186-7958-4642-9050-9b048b61ebbb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG nova.compute.manager [req-c8d5e0a9-9d27-4bb7-92e9-89c447d3a868 req-64add8bb-558e-423a-80d6-22daf8610cb9 service nova] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] No waiting events found dispatching network-vif-unplugged-fe7ac99b-2b51-4ae2-9903-9ae286328c8b {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:09:45 user nova-compute[71605]: DEBUG nova.compute.manager [req-c8d5e0a9-9d27-4bb7-92e9-89c447d3a868 req-64add8bb-558e-423a-80d6-22daf8610cb9 service nova] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Received event network-vif-unplugged-fe7ac99b-2b51-4ae2-9903-9ae286328c8b for instance with task_state deleting. {{(pid=71605) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 16:09:46 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Instance destroyed successfully. Apr 20 16:09:46 user nova-compute[71605]: DEBUG nova.objects.instance [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Lazy-loading 'resources' on Instance uuid 4f841186-7958-4642-9050-9b048b61ebbb {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:09:46 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:07:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-884360922',display_name='tempest-VolumesActionsTest-instance-884360922',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-884360922',id=17,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-20T16:08:00Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='ec92faae7a5d40f98409e9634a9dbf9b',ramdisk_id='',reservation_id='r-yae0i2gd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesActionsTest-1745644681',owner_user_name='tempest-VolumesActionsTest-1745644681-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:08:00Z,user_data=None,user_id='403bee038ece4e9aa023aad83ee8f188',uuid=4f841186-7958-4642-9050-9b048b61ebbb,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fe7ac99b-2b51-4ae2-9903-9ae286328c8b", "address": "fa:16:3e:6b:e1:59", "network": {"id": "6deca2c3-6467-44e2-aa30-6cf5abf230f5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1683519822-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "ec92faae7a5d40f98409e9634a9dbf9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe7ac99b-2b", "ovs_interfaceid": "fe7ac99b-2b51-4ae2-9903-9ae286328c8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:09:46 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Converting VIF {"id": "fe7ac99b-2b51-4ae2-9903-9ae286328c8b", "address": "fa:16:3e:6b:e1:59", "network": {"id": 
"6deca2c3-6467-44e2-aa30-6cf5abf230f5", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1683519822-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.2", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "ec92faae7a5d40f98409e9634a9dbf9b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe7ac99b-2b", "ovs_interfaceid": "fe7ac99b-2b51-4ae2-9903-9ae286328c8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:09:46 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:e1:59,bridge_name='br-int',has_traffic_filtering=True,id=fe7ac99b-2b51-4ae2-9903-9ae286328c8b,network=Network(6deca2c3-6467-44e2-aa30-6cf5abf230f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe7ac99b-2b') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:09:46 user nova-compute[71605]: DEBUG os_vif [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:e1:59,bridge_name='br-int',has_traffic_filtering=True,id=fe7ac99b-2b51-4ae2-9903-9ae286328c8b,network=Network(6deca2c3-6467-44e2-aa30-6cf5abf230f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe7ac99b-2b') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:09:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe7ac99b-2b, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:09:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:09:46 user nova-compute[71605]: INFO os_vif [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:e1:59,bridge_name='br-int',has_traffic_filtering=True,id=fe7ac99b-2b51-4ae2-9903-9ae286328c8b,network=Network(6deca2c3-6467-44e2-aa30-6cf5abf230f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe7ac99b-2b') Apr 20 
16:09:46 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Deleting instance files /opt/stack/data/nova/instances/4f841186-7958-4642-9050-9b048b61ebbb_del Apr 20 16:09:46 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Deletion of /opt/stack/data/nova/instances/4f841186-7958-4642-9050-9b048b61ebbb_del complete Apr 20 16:09:46 user nova-compute[71605]: INFO nova.compute.manager [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Took 0.67 seconds to destroy the instance on the hypervisor. Apr 20 16:09:46 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:09:46 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:09:46 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:09:46 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:09:46 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Took 0.66 seconds to deallocate network for instance. 
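The VIF teardown above ends in a single OVSDB transaction, DelPortCommand(port=tapfe7ac99b-2b, bridge=br-int, if_exists=True), committed by os-vif against the local ovsdb-server at tcp:127.0.0.1:6640. The sketch below shows an equivalent removal through the standard ovs-vsctl client; it illustrates what the transaction does, not the code path Nova/os-vif actually takes, and the bridge and port names are simply the ones from the log.

    import subprocess

    def del_port(bridge: str, port: str) -> None:
        """Remove a port from an Open vSwitch bridge, tolerating its absence."""
        # --if-exists mirrors if_exists=True on the logged DelPortCommand:
        # the call succeeds even if the tap device is already gone.
        subprocess.run(
            ["ovs-vsctl", "--if-exists", "del-port", bridge, port],
            check=True,
        )

    if __name__ == "__main__":
        del_port("br-int", "tapfe7ac99b-2b")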
Apr 20 16:09:46 user nova-compute[71605]: DEBUG nova.compute.manager [req-a4dd685e-85aa-4ecc-ab3e-fc6d47d88056 req-b93ae442-a6ae-4eeb-bfea-6438f1790c21 service nova] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Received event network-vif-deleted-fe7ac99b-2b51-4ae2-9903-9ae286328c8b {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:09:46 user nova-compute[71605]: INFO nova.compute.manager [req-a4dd685e-85aa-4ecc-ab3e-fc6d47d88056 req-b93ae442-a6ae-4eeb-bfea-6438f1790c21 service nova] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Neutron deleted interface fe7ac99b-2b51-4ae2-9903-9ae286328c8b; detaching it from the instance and deleting it from the info cache Apr 20 16:09:46 user nova-compute[71605]: DEBUG nova.network.neutron [req-a4dd685e-85aa-4ecc-ab3e-fc6d47d88056 req-b93ae442-a6ae-4eeb-bfea-6438f1790c21 service nova] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:09:46 user nova-compute[71605]: DEBUG nova.compute.manager [req-a4dd685e-85aa-4ecc-ab3e-fc6d47d88056 req-b93ae442-a6ae-4eeb-bfea-6438f1790c21 service nova] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Detach interface failed, port_id=fe7ac99b-2b51-4ae2-9903-9ae286328c8b, reason: Instance 4f841186-7958-4642-9050-9b048b61ebbb could not be found. {{(pid=71605) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 20 16:09:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:09:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:09:47 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:09:47 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:09:47 user 
nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.189s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:09:47 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Deleted allocations for instance 4f841186-7958-4642-9050-9b048b61ebbb Apr 20 16:09:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-11904068-9727-4a8c-9102-2b81b52d85b9 tempest-VolumesActionsTest-1745644681 tempest-VolumesActionsTest-1745644681-project-member] Lock "4f841186-7958-4642-9050-9b048b61ebbb" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.719s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:09:47 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:09:47 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:09:47 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:09:47 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:09:47 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:09:47 user nova-compute[71605]: DEBUG nova.compute.manager [req-e4b13943-a77c-46e9-9349-ef51c54b5031 req-3d5ce21d-f880-4b56-9882-89779bc314b4 service nova] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Received event network-vif-plugged-fe7ac99b-2b51-4ae2-9903-9ae286328c8b {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:09:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-e4b13943-a77c-46e9-9349-ef51c54b5031 req-3d5ce21d-f880-4b56-9882-89779bc314b4 service nova] Acquiring lock "4f841186-7958-4642-9050-9b048b61ebbb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:09:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-e4b13943-a77c-46e9-9349-ef51c54b5031 req-3d5ce21d-f880-4b56-9882-89779bc314b4 service nova] Lock "4f841186-7958-4642-9050-9b048b61ebbb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:09:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-e4b13943-a77c-46e9-9349-ef51c54b5031 req-3d5ce21d-f880-4b56-9882-89779bc314b4 service nova] Lock "4f841186-7958-4642-9050-9b048b61ebbb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:09:47 user nova-compute[71605]: DEBUG nova.compute.manager [req-e4b13943-a77c-46e9-9349-ef51c54b5031 req-3d5ce21d-f880-4b56-9882-89779bc314b4 service nova] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] No waiting events found dispatching network-vif-plugged-fe7ac99b-2b51-4ae2-9903-9ae286328c8b {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:09:47 user nova-compute[71605]: WARNING nova.compute.manager [req-e4b13943-a77c-46e9-9349-ef51c54b5031 req-3d5ce21d-f880-4b56-9882-89779bc314b4 service nova] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Received unexpected event network-vif-plugged-fe7ac99b-2b51-4ae2-9903-9ae286328c8b for instance with vm_state deleted and task_state None. 
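The network-vif-unplugged / network-vif-plugged / network-vif-deleted messages above are Neutron calling back into nova-compute through external_instance_event. The manager serializes each callback under a per-instance "<uuid>-events" lock, pops any waiter registered for that event name, and when nothing is waiting (as with the late network-vif-plugged for the already-deleted instance) it just logs and drops the event. Below is a minimal, self-contained sketch of that dispatch pattern; the class and helper are hypothetical stand-ins built on threading.Event rather than Nova's eventlet-based waiters.

    import threading
    from collections import defaultdict

    class InstanceEvents:
        """Track callers waiting for named external events, keyed per instance."""

        def __init__(self) -> None:
            self._lock = threading.Lock()        # stands in for the "<uuid>-events" lock
            self._waiters = defaultdict(dict)    # instance_uuid -> {event_name: Event}

        def prepare(self, instance_uuid: str, event_name: str) -> threading.Event:
            """Register interest in an event before starting the operation."""
            waiter = threading.Event()
            with self._lock:
                self._waiters[instance_uuid][event_name] = waiter
            return waiter

        def pop(self, instance_uuid: str, event_name: str):
            """Pop the waiter for an incoming event; None means nobody was waiting."""
            with self._lock:
                return self._waiters[instance_uuid].pop(event_name, None)

    def external_instance_event(events: InstanceEvents, instance_uuid: str, name: str) -> None:
        waiter = events.pop(instance_uuid, name)
        if waiter is None:
            # Corresponds to "No waiting events found dispatching ..." and to the
            # WARNING about an unexpected event on a deleted instance.
            print(f"no waiter for {name} on {instance_uuid}; dropping")
        else:
            waiter.set()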
Apr 20 16:09:48 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:09:48 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:09:48 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-15d42ba7-cf47-4374-83b5-06d5242951b7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:09:48 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-15d42ba7-cf47-4374-83b5-06d5242951b7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:09:48 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:09:48 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Updating instance_info_cache with network_info: [{"id": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "address": "fa:16:3e:15:a2:f4", "network": {"id": "9de26342-0f6c-4d7d-96a5-d4ad35573211", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1378273293-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.9", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbd2a72dddad4f2892243a33df4fa2d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape068d7e5-dc", "ovs_interfaceid": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:09:48 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock "refresh_cache-15d42ba7-cf47-4374-83b5-06d5242951b7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:09:48 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:09:49 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 
{{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:09:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:10:01 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:10:01 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] VM Stopped (Lifecycle Event) Apr 20 16:10:01 user nova-compute[71605]: DEBUG nova.compute.manager [None req-75ea97ab-57ae-4171-bdbf-8e3f9cde4e3c None None] [instance: 4f841186-7958-4642-9050-9b048b61ebbb] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:10:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:10:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:10:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:10:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:10:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:10:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:10:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:10:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:10:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:10:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:10:19 user nova-compute[71605]: DEBUG nova.compute.manager [req-e9ecf419-0596-4461-8de4-853325e2665e req-008ab4f6-efed-4543-bb3f-1cf07fdb460c service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Received event network-changed-9223f738-4299-44ed-8e8f-c39e3353e39d {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:10:19 user nova-compute[71605]: DEBUG nova.compute.manager [req-e9ecf419-0596-4461-8de4-853325e2665e req-008ab4f6-efed-4543-bb3f-1cf07fdb460c service nova] [instance: 
f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Refreshing instance network info cache due to event network-changed-9223f738-4299-44ed-8e8f-c39e3353e39d. {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:10:19 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-e9ecf419-0596-4461-8de4-853325e2665e req-008ab4f6-efed-4543-bb3f-1cf07fdb460c service nova] Acquiring lock "refresh_cache-f6d19a54-ca7e-46fc-af21-6a7ddbc6604f" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:10:19 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-e9ecf419-0596-4461-8de4-853325e2665e req-008ab4f6-efed-4543-bb3f-1cf07fdb460c service nova] Acquired lock "refresh_cache-f6d19a54-ca7e-46fc-af21-6a7ddbc6604f" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:10:19 user nova-compute[71605]: DEBUG nova.network.neutron [req-e9ecf419-0596-4461-8de4-853325e2665e req-008ab4f6-efed-4543-bb3f-1cf07fdb460c service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Refreshing network info cache for port 9223f738-4299-44ed-8e8f-c39e3353e39d {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:10:20 user nova-compute[71605]: DEBUG nova.network.neutron [req-e9ecf419-0596-4461-8de4-853325e2665e req-008ab4f6-efed-4543-bb3f-1cf07fdb460c service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Updated VIF entry in instance network info cache for port 9223f738-4299-44ed-8e8f-c39e3353e39d. {{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:10:20 user nova-compute[71605]: DEBUG nova.network.neutron [req-e9ecf419-0596-4461-8de4-853325e2665e req-008ab4f6-efed-4543-bb3f-1cf07fdb460c service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Updating instance_info_cache with network_info: [{"id": "9223f738-4299-44ed-8e8f-c39e3353e39d", "address": "fa:16:3e:d6:0d:19", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.22", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9223f738-42", "ovs_interfaceid": "9223f738-4299-44ed-8e8f-c39e3353e39d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:10:20 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-e9ecf419-0596-4461-8de4-853325e2665e req-008ab4f6-efed-4543-bb3f-1cf07fdb460c service nova] Releasing lock "refresh_cache-f6d19a54-ca7e-46fc-af21-6a7ddbc6604f" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:10:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:10:21 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:10:21 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:10:21 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:10:21 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:10:21 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:10:21 user nova-compute[71605]: INFO nova.compute.manager [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Terminating instance Apr 20 16:10:21 user nova-compute[71605]: DEBUG nova.compute.manager [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Start destroying the instance on the hypervisor. 
{{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:10:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:10:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:10:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:10:21 user nova-compute[71605]: DEBUG nova.compute.manager [req-d1a04897-742a-4b42-8603-260c023b3b04 req-100786de-c454-4cf8-950d-74eff3524d11 service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Received event network-vif-unplugged-9223f738-4299-44ed-8e8f-c39e3353e39d {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:10:21 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-d1a04897-742a-4b42-8603-260c023b3b04 req-100786de-c454-4cf8-950d-74eff3524d11 service nova] Acquiring lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:10:21 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-d1a04897-742a-4b42-8603-260c023b3b04 req-100786de-c454-4cf8-950d-74eff3524d11 service nova] Lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:10:21 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-d1a04897-742a-4b42-8603-260c023b3b04 req-100786de-c454-4cf8-950d-74eff3524d11 service nova] Lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:10:21 user nova-compute[71605]: DEBUG nova.compute.manager [req-d1a04897-742a-4b42-8603-260c023b3b04 req-100786de-c454-4cf8-950d-74eff3524d11 service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] No waiting events found dispatching network-vif-unplugged-9223f738-4299-44ed-8e8f-c39e3353e39d {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:10:21 user nova-compute[71605]: DEBUG nova.compute.manager [req-d1a04897-742a-4b42-8603-260c023b3b04 req-100786de-c454-4cf8-950d-74eff3524d11 service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Received event network-vif-unplugged-9223f738-4299-44ed-8e8f-c39e3353e39d for instance with task_state deleting. {{(pid=71605) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 16:10:22 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Instance destroyed successfully. 
Apr 20 16:10:22 user nova-compute[71605]: DEBUG nova.objects.instance [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lazy-loading 'resources' on Instance uuid f6d19a54-ca7e-46fc-af21-6a7ddbc6604f {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:10:22 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:08:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-407901735',display_name='tempest-AttachVolumeNegativeTest-server-407901735',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-407901735',id=18,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG4OSrhcAGhvt/1td6lSrBTjgRGg10CjLCL1EmuHW6q7czt1RgqBpWAsoiQyoSTiBzeuddL47KN04jWageIBB5Wx1XgbbdYqtpRoz/r1eG4scj8/SDy6MikQDo96K7/ZPw==',key_name='tempest-keypair-627338015',keypairs=,launch_index=0,launched_at=2023-04-20T16:08:36Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='71cf2664111f45788d24092e8ceede9c',ramdisk_id='',reservation_id='r-wsud3dti',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-308436039',owner_user_name='tempest-AttachVolumeNegativeTest-308436039-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:08:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='690c49feae904687826fb959ba5ba283',uuid=f6d19a54-ca7e-46fc-af21-6a7ddbc6604f,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9223f738-4299-44ed-8e8f-c39e3353e39d", "address": "fa:16:3e:d6:0d:19", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.22", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9223f738-42", "ovs_interfaceid": "9223f738-4299-44ed-8e8f-c39e3353e39d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:10:22 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Converting VIF {"id": "9223f738-4299-44ed-8e8f-c39e3353e39d", "address": "fa:16:3e:d6:0d:19", "network": {"id": "2dc9b3da-0124-4718-9f70-a131cd030480", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-766632698-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.22", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "71cf2664111f45788d24092e8ceede9c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap9223f738-42", "ovs_interfaceid": "9223f738-4299-44ed-8e8f-c39e3353e39d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:10:22 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d6:0d:19,bridge_name='br-int',has_traffic_filtering=True,id=9223f738-4299-44ed-8e8f-c39e3353e39d,network=Network(2dc9b3da-0124-4718-9f70-a131cd030480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9223f738-42') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:10:22 user nova-compute[71605]: DEBUG os_vif [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d6:0d:19,bridge_name='br-int',has_traffic_filtering=True,id=9223f738-4299-44ed-8e8f-c39e3353e39d,network=Network(2dc9b3da-0124-4718-9f70-a131cd030480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9223f738-42') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:10:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:10:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9223f738-42, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:10:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:10:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:10:22 user nova-compute[71605]: INFO os_vif [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d6:0d:19,bridge_name='br-int',has_traffic_filtering=True,id=9223f738-4299-44ed-8e8f-c39e3353e39d,network=Network(2dc9b3da-0124-4718-9f70-a131cd030480),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9223f738-42') Apr 20 16:10:22 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Deleting instance files /opt/stack/data/nova/instances/f6d19a54-ca7e-46fc-af21-6a7ddbc6604f_del Apr 20 16:10:22 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Deletion of /opt/stack/data/nova/instances/f6d19a54-ca7e-46fc-af21-6a7ddbc6604f_del complete Apr 20 16:10:22 user nova-compute[71605]: INFO nova.compute.manager [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Took 0.85 seconds to destroy the instance on the hypervisor. Apr 20 16:10:22 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:10:22 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:10:22 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:10:23 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:10:23 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Took 1.19 seconds to deallocate network for instance. Apr 20 16:10:23 user nova-compute[71605]: DEBUG nova.compute.manager [req-11036519-df74-4a0e-891a-d3e44962d6d3 req-dd7064d1-561a-4ddd-befd-02ddffeac652 service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Received event network-vif-deleted-9223f738-4299-44ed-8e8f-c39e3353e39d {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:10:23 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:10:23 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:10:23 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:10:23 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:10:23 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 
tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.166s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:10:23 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Deleted allocations for instance f6d19a54-ca7e-46fc-af21-6a7ddbc6604f Apr 20 16:10:23 user nova-compute[71605]: DEBUG nova.compute.manager [req-3fdd4c96-3777-4fa0-a0e8-d481c4111bde req-5ab0b75e-9a8f-4a23-b3b5-0324ee675dbe service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Received event network-vif-plugged-9223f738-4299-44ed-8e8f-c39e3353e39d {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:10:23 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-3fdd4c96-3777-4fa0-a0e8-d481c4111bde req-5ab0b75e-9a8f-4a23-b3b5-0324ee675dbe service nova] Acquiring lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:10:23 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-3fdd4c96-3777-4fa0-a0e8-d481c4111bde req-5ab0b75e-9a8f-4a23-b3b5-0324ee675dbe service nova] Lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:10:23 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-3fdd4c96-3777-4fa0-a0e8-d481c4111bde req-5ab0b75e-9a8f-4a23-b3b5-0324ee675dbe service nova] Lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:10:23 user nova-compute[71605]: DEBUG nova.compute.manager [req-3fdd4c96-3777-4fa0-a0e8-d481c4111bde req-5ab0b75e-9a8f-4a23-b3b5-0324ee675dbe service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] No waiting events found dispatching network-vif-plugged-9223f738-4299-44ed-8e8f-c39e3353e39d {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:10:23 user nova-compute[71605]: WARNING nova.compute.manager [req-3fdd4c96-3777-4fa0-a0e8-d481c4111bde req-5ab0b75e-9a8f-4a23-b3b5-0324ee675dbe service nova] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Received unexpected event network-vif-plugged-9223f738-4299-44ed-8e8f-c39e3353e39d for instance with vm_state deleted and task_state None. 
Apr 20 16:10:23 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-77e4714d-2df6-4f6e-bddc-fa07b761b845 tempest-AttachVolumeNegativeTest-308436039 tempest-AttachVolumeNegativeTest-308436039-project-member] Lock "f6d19a54-ca7e-46fc-af21-6a7ddbc6604f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.398s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:10:24 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:10:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:10:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:10:37 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:10:37 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] VM Stopped (Lifecycle Event) Apr 20 16:10:37 user nova-compute[71605]: DEBUG nova.compute.manager [None req-18ad945a-9ffc-4295-b05f-0fcd5906c407 None None] [instance: f6d19a54-ca7e-46fc-af21-6a7ddbc6604f] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:10:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:10:42 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:10:42 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:10:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:10:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:10:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:10:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:10:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:10:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:10:43 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:10:44 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:10:44 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:10:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:10:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:10:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:10:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:10:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:10:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:10:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json {{(pid=71605) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:10:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:10:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:10:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:10:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:10:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:10:45 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:10:45 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 16:10:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=8892MB free_disk=26.374759674072266GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:10:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:10:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:10:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance d4ea4d29-b178-4da2-b971-76f97031b244 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:10:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 15d42ba7-cf47-4374-83b5-06d5242951b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:10:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:10:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:10:45 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:10:45 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:10:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:10:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.221s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:10:46 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:10:46 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:10:46 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:10:46 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:10:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:10:48 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:10:48 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:10:48 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Didn't find any instances for network info cache update. {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 20 16:10:49 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:10:50 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:10:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:10:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:10:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:10:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:10:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:10:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:10:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:10:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:10:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:10:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on 
fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:03 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:04 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Acquiring lock "6766bf80-e99c-4363-b229-84049f21b1a2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Lock "6766bf80-e99c-4363-b229-84049f21b1a2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG nova.compute.manager [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Starting instance... {{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:11:09 user nova-compute[71605]: INFO nova.compute.claims [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Claim successful on node user Apr 20 16:11:09 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.225s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG nova.compute.manager [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Start building networks asynchronously for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG nova.compute.manager [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Allocating IP information in the background. {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG nova.network.neutron [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:11:09 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 20 16:11:09 user nova-compute[71605]: DEBUG nova.compute.manager [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG nova.policy [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '851a3140769443e4b62b73b987e2b417', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3cc4f384caf14fb3baff01938400bb4b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG nova.compute.manager [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Start spawning the instance on the hypervisor. {{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:11:09 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Creating image(s) Apr 20 16:11:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Acquiring lock "/opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Lock "/opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG 
oslo_concurrency.lockutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Lock "/opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.130s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 
--force-share --output=json" returned: 0 in 0.133s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2/disk 1073741824" returned: 0 in 0.071s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.211s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:11:09 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:11:10 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.138s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:11:10 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Checking if we can resize image /opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2/disk. 
size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:11:10 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:11:10 user nova-compute[71605]: DEBUG nova.network.neutron [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Successfully created port: 6ee15985-c252-4270-ad30-f6f5efe94b86 {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:11:10 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:11:10 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Cannot resize image /opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2/disk to a smaller size. 
{{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:11:10 user nova-compute[71605]: DEBUG nova.objects.instance [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Lazy-loading 'migration_context' on Instance uuid 6766bf80-e99c-4363-b229-84049f21b1a2 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:11:10 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:11:10 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Ensure instance console log exists: /opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:11:10 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:11:10 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:11:10 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:11:10 user nova-compute[71605]: DEBUG nova.network.neutron [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Successfully updated port: 6ee15985-c252-4270-ad30-f6f5efe94b86 {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:11:10 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Acquiring lock "refresh_cache-6766bf80-e99c-4363-b229-84049f21b1a2" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:11:10 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d 
tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Acquired lock "refresh_cache-6766bf80-e99c-4363-b229-84049f21b1a2" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:11:10 user nova-compute[71605]: DEBUG nova.network.neutron [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:11:10 user nova-compute[71605]: DEBUG nova.compute.manager [req-064621c4-536b-4b6f-96ab-5613f99cbf9c req-e80ac663-728e-4d59-b270-58cb411b1859 service nova] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Received event network-changed-6ee15985-c252-4270-ad30-f6f5efe94b86 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:11:10 user nova-compute[71605]: DEBUG nova.compute.manager [req-064621c4-536b-4b6f-96ab-5613f99cbf9c req-e80ac663-728e-4d59-b270-58cb411b1859 service nova] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Refreshing instance network info cache due to event network-changed-6ee15985-c252-4270-ad30-f6f5efe94b86. {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:11:10 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-064621c4-536b-4b6f-96ab-5613f99cbf9c req-e80ac663-728e-4d59-b270-58cb411b1859 service nova] Acquiring lock "refresh_cache-6766bf80-e99c-4363-b229-84049f21b1a2" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:11:10 user nova-compute[71605]: DEBUG nova.network.neutron [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Instance cache missing network info. 
{{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.network.neutron [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Updating instance_info_cache with network_info: [{"id": "6ee15985-c252-4270-ad30-f6f5efe94b86", "address": "fa:16:3e:7f:f1:65", "network": {"id": "9ac82ace-9d10-48a7-8d31-7e6637f442b4", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-98485346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3cc4f384caf14fb3baff01938400bb4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee15985-c2", "ovs_interfaceid": "6ee15985-c252-4270-ad30-f6f5efe94b86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Releasing lock "refresh_cache-6766bf80-e99c-4363-b229-84049f21b1a2" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.compute.manager [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Instance network_info: |[{"id": "6ee15985-c252-4270-ad30-f6f5efe94b86", "address": "fa:16:3e:7f:f1:65", "network": {"id": "9ac82ace-9d10-48a7-8d31-7e6637f442b4", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-98485346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3cc4f384caf14fb3baff01938400bb4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee15985-c2", "ovs_interfaceid": "6ee15985-c252-4270-ad30-f6f5efe94b86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-064621c4-536b-4b6f-96ab-5613f99cbf9c req-e80ac663-728e-4d59-b270-58cb411b1859 service nova] Acquired lock "refresh_cache-6766bf80-e99c-4363-b229-84049f21b1a2" {{(pid=71605) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.network.neutron [req-064621c4-536b-4b6f-96ab-5613f99cbf9c req-e80ac663-728e-4d59-b270-58cb411b1859 service nova] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Refreshing network info cache for port 6ee15985-c252-4270-ad30-f6f5efe94b86 {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Start _get_guest_xml network_info=[{"id": "6ee15985-c252-4270-ad30-f6f5efe94b86", "address": "fa:16:3e:7f:f1:65", "network": {"id": "9ac82ace-9d10-48a7-8d31-7e6637f442b4", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-98485346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3cc4f384caf14fb3baff01938400bb4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee15985-c2", "ovs_interfaceid": "6ee15985-c252-4270-ad30-f6f5efe94b86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': '4ac69ea5-e5d7-40c8-864e-0a164d78a727'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:11:11 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:11:11 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:11:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-771415346',display_name='tempest-SnapshotDataIntegrityTests-server-771415346',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-771415346',id=19,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL7kqNWnEn2hJiKbn+roVzFAUQv+vrvS2keY49vWi3kz6H/X1mSRhv0lb/LsmmiG61nDcYpkJebf4uXL/6YHuVK1fQRP8/2MwnEkEV1Hk939UStYtsqYSVjo2UDwQtWlYg==',key_name='tempest-SnapshotDataIntegrityTests-1504266769',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3cc4f384caf14fb3baff01938400bb4b',ramdisk_id='',reservation_id='r-gi8180jx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-SnapshotDataIntegrityTests-901842767',owner_user_name='tempest-SnapshotDataIntegrityTests-901842767-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:11:10Z,user_data=None,user_id='851a3140769443e4b62b73b987e2b417',uuid=6766bf80-e99c-4363-b229-84049f21b1a2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6ee15985-c252-4270-ad30-f6f5efe94b86", "address": "fa:16:3e:7f:f1:65", "network": {"id": "9ac82ace-9d10-48a7-8d31-7e6637f442b4", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-98485346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3cc4f384caf14fb3baff01938400bb4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee15985-c2", "ovs_interfaceid": "6ee15985-c252-4270-ad30-f6f5efe94b86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Converting VIF {"id": "6ee15985-c252-4270-ad30-f6f5efe94b86", "address": "fa:16:3e:7f:f1:65", "network": {"id": "9ac82ace-9d10-48a7-8d31-7e6637f442b4", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-98485346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3cc4f384caf14fb3baff01938400bb4b", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee15985-c2", "ovs_interfaceid": "6ee15985-c252-4270-ad30-f6f5efe94b86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:f1:65,bridge_name='br-int',has_traffic_filtering=True,id=6ee15985-c252-4270-ad30-f6f5efe94b86,network=Network(9ac82ace-9d10-48a7-8d31-7e6637f442b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ee15985-c2') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.objects.instance [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Lazy-loading 'pci_devices' on Instance uuid 6766bf80-e99c-4363-b229-84049f21b1a2 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} 
Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] End _get_guest_xml xml= [libvirt guest XML elided: the XML markup was stripped when this log was captured, leaving only its text nodes; recoverable values are uuid 6766bf80-e99c-4363-b229-84049f21b1a2, name instance-00000013, memory 131072 KiB, 1 vCPU, nova metadata for tempest-SnapshotDataIntegrityTests-server-771415346 (created 2023-04-20 16:11:11; flavor 128 MB, 1 vCPU, 1 GB root, 0 ephemeral, 0 swap; owner tempest-SnapshotDataIntegrityTests-901842767-project-member / tempest-SnapshotDataIntegrityTests-901842767), sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0 / Virtual Machine, os type hvm, CPU model Nehalem, rng backend /dev/urandom] {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} 
Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:11:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-771415346',display_name='tempest-SnapshotDataIntegrityTests-server-771415346',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-771415346',id=19,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL7kqNWnEn2hJiKbn+roVzFAUQv+vrvS2keY49vWi3kz6H/X1mSRhv0lb/LsmmiG61nDcYpkJebf4uXL/6YHuVK1fQRP8/2MwnEkEV1Hk939UStYtsqYSVjo2UDwQtWlYg==',key_name='tempest-SnapshotDataIntegrityTests-1504266769',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3cc4f384caf14fb3baff01938400bb4b',ramdisk_id='',reservation_id='r-gi8180jx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-SnapshotDataIntegrityTests-901842767',owner_user_name='tempest-SnapshotDataIntegrityTests-901842767-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:11:10Z,user_data=None,user_id='851a3140769443e4b62b73b987e2b417',uuid=6766bf80-e99c-4363-b229-84049f21b1a2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6ee15985-c252-4270-ad30-f6f5efe94b86", "address": "fa:16:3e:7f:f1:65", "network": {"id": "9ac82ace-9d10-48a7-8d31-7e6637f442b4", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-98485346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3cc4f384caf14fb3baff01938400bb4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee15985-c2", "ovs_interfaceid": "6ee15985-c252-4270-ad30-f6f5efe94b86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Converting VIF {"id": "6ee15985-c252-4270-ad30-f6f5efe94b86", "address": "fa:16:3e:7f:f1:65", "network": {"id": "9ac82ace-9d10-48a7-8d31-7e6637f442b4", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-98485346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3cc4f384caf14fb3baff01938400bb4b", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee15985-c2", "ovs_interfaceid": "6ee15985-c252-4270-ad30-f6f5efe94b86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:f1:65,bridge_name='br-int',has_traffic_filtering=True,id=6ee15985-c252-4270-ad30-f6f5efe94b86,network=Network(9ac82ace-9d10-48a7-8d31-7e6637f442b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ee15985-c2') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG os_vif [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:f1:65,bridge_name='br-int',has_traffic_filtering=True,id=6ee15985-c252-4270-ad30-f6f5efe94b86,network=Network(9ac82ace-9d10-48a7-8d31-7e6637f442b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ee15985-c2') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6ee15985-c2, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6ee15985-c2, col_values=(('external_ids', {'iface-id': '6ee15985-c252-4270-ad30-f6f5efe94b86', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7f:f1:65', 'vm-uuid': '6766bf80-e99c-4363-b229-84049f21b1a2'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:11 user nova-compute[71605]: INFO os_vif [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:f1:65,bridge_name='br-int',has_traffic_filtering=True,id=6ee15985-c252-4270-ad30-f6f5efe94b86,network=Network(9ac82ace-9d10-48a7-8d31-7e6637f442b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ee15985-c2') Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] No BDM found with device name vda, not building metadata. {{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] No VIF found with MAC fa:16:3e:7f:f1:65, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.network.neutron [req-064621c4-536b-4b6f-96ab-5613f99cbf9c req-e80ac663-728e-4d59-b270-58cb411b1859 service nova] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Updated VIF entry in instance network info cache for port 6ee15985-c252-4270-ad30-f6f5efe94b86. 
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG nova.network.neutron [req-064621c4-536b-4b6f-96ab-5613f99cbf9c req-e80ac663-728e-4d59-b270-58cb411b1859 service nova] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Updating instance_info_cache with network_info: [{"id": "6ee15985-c252-4270-ad30-f6f5efe94b86", "address": "fa:16:3e:7f:f1:65", "network": {"id": "9ac82ace-9d10-48a7-8d31-7e6637f442b4", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-98485346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3cc4f384caf14fb3baff01938400bb4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee15985-c2", "ovs_interfaceid": "6ee15985-c252-4270-ad30-f6f5efe94b86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:11:11 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-064621c4-536b-4b6f-96ab-5613f99cbf9c req-e80ac663-728e-4d59-b270-58cb411b1859 service nova] Releasing lock "refresh_cache-6766bf80-e99c-4363-b229-84049f21b1a2" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:11:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:12 user nova-compute[71605]: DEBUG nova.compute.manager [req-6f334ffa-cc07-44d1-9ffc-6297da423708 req-4c214097-010c-46ce-9e4e-da7e07da31da service nova] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Received event network-vif-plugged-6ee15985-c252-4270-ad30-f6f5efe94b86 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:11:12 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-6f334ffa-cc07-44d1-9ffc-6297da423708 req-4c214097-010c-46ce-9e4e-da7e07da31da service nova] Acquiring lock "6766bf80-e99c-4363-b229-84049f21b1a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:11:12 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-6f334ffa-cc07-44d1-9ffc-6297da423708 req-4c214097-010c-46ce-9e4e-da7e07da31da service nova] Lock "6766bf80-e99c-4363-b229-84049f21b1a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:11:12 user 
nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-6f334ffa-cc07-44d1-9ffc-6297da423708 req-4c214097-010c-46ce-9e4e-da7e07da31da service nova] Lock "6766bf80-e99c-4363-b229-84049f21b1a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:11:12 user nova-compute[71605]: DEBUG nova.compute.manager [req-6f334ffa-cc07-44d1-9ffc-6297da423708 req-4c214097-010c-46ce-9e4e-da7e07da31da service nova] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] No waiting events found dispatching network-vif-plugged-6ee15985-c252-4270-ad30-f6f5efe94b86 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:11:12 user nova-compute[71605]: WARNING nova.compute.manager [req-6f334ffa-cc07-44d1-9ffc-6297da423708 req-4c214097-010c-46ce-9e4e-da7e07da31da service nova] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Received unexpected event network-vif-plugged-6ee15985-c252-4270-ad30-f6f5efe94b86 for instance with vm_state building and task_state spawning. Apr 20 16:11:13 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:13 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:13 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:14 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:14 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:11:14 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] VM Resumed (Lifecycle Event) Apr 20 16:11:14 user nova-compute[71605]: DEBUG nova.compute.manager [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:11:14 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:11:14 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Instance spawned successfully. 
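
The qemu-img calls recorded above at 16:11:09 (an `info` probe of the cached base image followed by `create -f qcow2` layering the instance disk on top of it) can be replayed by hand when debugging the image backend. A minimal sketch, using plain subprocess rather than nova's oslo.concurrency/prlimit wrapper and assuming the same paths as in this log:

    import json
    import os
    import subprocess

    BASE = "/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890"
    DISK = "/opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2/disk"
    ENV = {**os.environ, "LC_ALL": "C", "LANG": "C"}

    # Mirror the logged "qemu-img info ... --force-share --output=json" probe of the base image.
    info = json.loads(subprocess.check_output(
        ["qemu-img", "info", BASE, "--force-share", "--output=json"], env=ENV))
    print(info["format"], info["virtual-size"])

    # Mirror the logged "qemu-img create" call; 1073741824 bytes is the 1 GB root disk
    # of the m1.nano flavor used in this run.
    subprocess.check_call(
        ["qemu-img", "create", "-f", "qcow2",
         "-o", f"backing_file={BASE},backing_fmt=raw", DISK, "1073741824"], env=ENV)
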
Apr 20 16:11:14 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:11:14 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:11:14 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:11:14 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:11:14 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Found default for hw_disk_bus of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:11:14 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:11:14 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:11:14 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:11:14 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] 
[instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Found default for hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:11:14 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:11:14 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:11:14 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] VM Started (Lifecycle Event) Apr 20 16:11:14 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:11:14 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:11:14 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:11:14 user nova-compute[71605]: INFO nova.compute.manager [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Took 5.29 seconds to spawn the instance on the hypervisor. Apr 20 16:11:14 user nova-compute[71605]: DEBUG nova.compute.manager [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:11:14 user nova-compute[71605]: INFO nova.compute.manager [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Took 5.82 seconds to build instance. 
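
The spawn and build timings that nova-compute reports here ("Took 5.29 seconds to spawn the instance on the hypervisor", "Took 5.82 seconds to build instance") are easy to pull out of a journal excerpt like this one. A small sketch, assuming the excerpt has been saved to a plain text file (the filename is illustrative):

    import re
    from collections import defaultdict

    # Matches nova-compute's "Took <n> seconds to spawn/build ..." INFO messages.
    TIMING = re.compile(
        r"\[instance: (?P<uuid>[0-9a-f-]{36})\] Took (?P<secs>[\d.]+) seconds to "
        r"(?P<what>spawn the instance on the hypervisor|build instance)")

    def extract_timings(path):
        """Return {instance uuid: {phase: seconds}} parsed from a nova-compute log."""
        timings = defaultdict(dict)
        with open(path, errors="replace") as fh:
            for line in fh:
                for m in TIMING.finditer(line):
                    phase = "spawn" if m.group("what").startswith("spawn") else "build"
                    timings[m.group("uuid")][phase] = float(m.group("secs"))
        return dict(timings)

    print(extract_timings("nova-compute.log"))
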
Apr 20 16:11:14 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-c669bb97-0988-4237-ba99-55abe29e5c3d tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Lock "6766bf80-e99c-4363-b229-84049f21b1a2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 5.911s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:11:15 user nova-compute[71605]: DEBUG nova.compute.manager [req-cceca842-3d3d-42e7-9635-4670b1e8c6a1 req-2fe5b254-ccf9-4614-bea7-23d22c93d3b3 service nova] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Received event network-vif-plugged-6ee15985-c252-4270-ad30-f6f5efe94b86 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:11:15 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-cceca842-3d3d-42e7-9635-4670b1e8c6a1 req-2fe5b254-ccf9-4614-bea7-23d22c93d3b3 service nova] Acquiring lock "6766bf80-e99c-4363-b229-84049f21b1a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:11:15 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-cceca842-3d3d-42e7-9635-4670b1e8c6a1 req-2fe5b254-ccf9-4614-bea7-23d22c93d3b3 service nova] Lock "6766bf80-e99c-4363-b229-84049f21b1a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:11:15 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-cceca842-3d3d-42e7-9635-4670b1e8c6a1 req-2fe5b254-ccf9-4614-bea7-23d22c93d3b3 service nova] Lock "6766bf80-e99c-4363-b229-84049f21b1a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:11:15 user nova-compute[71605]: DEBUG nova.compute.manager [req-cceca842-3d3d-42e7-9635-4670b1e8c6a1 req-2fe5b254-ccf9-4614-bea7-23d22c93d3b3 service nova] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] No waiting events found dispatching network-vif-plugged-6ee15985-c252-4270-ad30-f6f5efe94b86 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:11:15 user nova-compute[71605]: WARNING nova.compute.manager [req-cceca842-3d3d-42e7-9635-4670b1e8c6a1 req-2fe5b254-ccf9-4614-bea7-23d22c93d3b3 service nova] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Received unexpected event network-vif-plugged-6ee15985-c252-4270-ad30-f6f5efe94b86 for instance with vm_state active and task_state None. 
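
The lock traffic throughout this log comes from oslo.concurrency's lockutils: the bare "Acquiring"/"Acquired"/"Releasing lock" lines are emitted by its lock() context manager (lockutils.py:312/315/333 above), while the lines that name a function and report ":: waited" / ":: held" times come from the synchronized() decorator (the inner wrapper at lockutils.py:404/409/423) that nova puts around handlers such as _locked_do_build_and_run_instance and _pop_event. A minimal sketch of the same two patterns; the lock names and functions below are illustrative, not nova code:

    from oslo_concurrency import lockutils

    # Decorator form: logs the "Lock ... acquired by ... :: waited" / ":: held" DEBUG pairs.
    @lockutils.synchronized("demo-instance-events")
    def pop_event(events, name):
        """Illustrative stand-in for an event handler guarded by a named lock."""
        return events.pop(name, None)

    # Context-manager form: logs the "Acquiring"/"Acquired"/"Releasing lock" DEBUG lines.
    def refresh_cache(instance_uuid):
        with lockutils.lock(f"refresh_cache-{instance_uuid}"):
            pass  # rebuild cached data while the lock is held
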
Apr 20 16:11:15 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:19 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:11:29 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Acquiring lock "d4ea4d29-b178-4da2-b971-76f97031b244" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:11:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "d4ea4d29-b178-4da2-b971-76f97031b244" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:11:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Acquiring lock "d4ea4d29-b178-4da2-b971-76f97031b244-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:11:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "d4ea4d29-b178-4da2-b971-76f97031b244-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:11:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "d4ea4d29-b178-4da2-b971-76f97031b244-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:11:30 user nova-compute[71605]: INFO nova.compute.manager [None 
req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Terminating instance Apr 20 16:11:30 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Start destroying the instance on the hypervisor. {{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:11:30 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:30 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:30 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:30 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:30 user nova-compute[71605]: DEBUG nova.compute.manager [req-b61016d1-9bd8-414c-840c-46bb8c02bda4 req-c082683b-abdf-4c4d-9e2d-5f16b4e42f38 service nova] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Received event network-vif-unplugged-0b36b1a4-9ab6-49cb-9a5e-afc32792783e {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:11:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b61016d1-9bd8-414c-840c-46bb8c02bda4 req-c082683b-abdf-4c4d-9e2d-5f16b4e42f38 service nova] Acquiring lock "d4ea4d29-b178-4da2-b971-76f97031b244-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:11:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b61016d1-9bd8-414c-840c-46bb8c02bda4 req-c082683b-abdf-4c4d-9e2d-5f16b4e42f38 service nova] Lock "d4ea4d29-b178-4da2-b971-76f97031b244-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:11:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b61016d1-9bd8-414c-840c-46bb8c02bda4 req-c082683b-abdf-4c4d-9e2d-5f16b4e42f38 service nova] Lock "d4ea4d29-b178-4da2-b971-76f97031b244-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:11:30 user nova-compute[71605]: DEBUG nova.compute.manager [req-b61016d1-9bd8-414c-840c-46bb8c02bda4 req-c082683b-abdf-4c4d-9e2d-5f16b4e42f38 service nova] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] No waiting events found dispatching network-vif-unplugged-0b36b1a4-9ab6-49cb-9a5e-afc32792783e {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:11:30 user nova-compute[71605]: DEBUG nova.compute.manager [req-b61016d1-9bd8-414c-840c-46bb8c02bda4 req-c082683b-abdf-4c4d-9e2d-5f16b4e42f38 service nova] [instance: 
d4ea4d29-b178-4da2-b971-76f97031b244] Received event network-vif-unplugged-0b36b1a4-9ab6-49cb-9a5e-afc32792783e for instance with task_state deleting. {{(pid=71605) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 16:11:30 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Instance destroyed successfully. Apr 20 16:11:30 user nova-compute[71605]: DEBUG nova.objects.instance [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lazy-loading 'resources' on Instance uuid d4ea4d29-b178-4da2-b971-76f97031b244 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:11:31 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:02:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1335393395',display_name='tempest-ServersNegativeTestJSON-server-1335393395',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1335393395',id=2,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-20T16:03:12Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='d8444d3c8f554a56967917670b19dc37',ramdisk_id='',reservation_id='r-955d2plh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersNegativeTestJSON-942369263',owner_user_name='tempest-ServersNegativeTestJSON-942369263-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:03:12Z,user_data=None,user_id='9be25e958c6047068ab5ce63106b0754',uuid=d4ea4d29-b178-4da2-b971-76f97031b244,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "address": "fa:16:3e:44:d8:d0", "network": {"id": "c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b36b1a4-9a", "ovs_interfaceid": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:11:31 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Converting VIF {"id": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "address": "fa:16:3e:44:d8:d0", "network": {"id": "c36830a6-66f7-4f28-8879-e228da46cead", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-655574662-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "d8444d3c8f554a56967917670b19dc37", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b36b1a4-9a", "ovs_interfaceid": "0b36b1a4-9ab6-49cb-9a5e-afc32792783e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:11:31 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:44:d8:d0,bridge_name='br-int',has_traffic_filtering=True,id=0b36b1a4-9ab6-49cb-9a5e-afc32792783e,network=Network(c36830a6-66f7-4f28-8879-e228da46cead),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b36b1a4-9a') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:11:31 user nova-compute[71605]: DEBUG os_vif [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:d8:d0,bridge_name='br-int',has_traffic_filtering=True,id=0b36b1a4-9ab6-49cb-9a5e-afc32792783e,network=Network(c36830a6-66f7-4f28-8879-e228da46cead),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b36b1a4-9a') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:11:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 
command(idx=0): DelPortCommand(_result=None, port=tap0b36b1a4-9a, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:11:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:31 user nova-compute[71605]: INFO os_vif [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:d8:d0,bridge_name='br-int',has_traffic_filtering=True,id=0b36b1a4-9ab6-49cb-9a5e-afc32792783e,network=Network(c36830a6-66f7-4f28-8879-e228da46cead),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b36b1a4-9a') Apr 20 16:11:31 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Deleting instance files /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244_del Apr 20 16:11:31 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Deletion of /opt/stack/data/nova/instances/d4ea4d29-b178-4da2-b971-76f97031b244_del complete Apr 20 16:11:31 user nova-compute[71605]: INFO nova.compute.manager [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Took 0.78 seconds to destroy the instance on the hypervisor. Apr 20 16:11:31 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:11:31 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:11:31 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:11:31 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:11:31 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Took 0.54 seconds to deallocate network for instance. 
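Annotation: the teardown above goes through os_vif: the Nova VIF dict is converted to a VIFOpenVSwitch object and unplugged, which issues the ovsdbapp DelPortCommand removing tap0b36b1a4-9a from br-int. A rough sketch of driving the same public os_vif entry points directly; field values are copied from the log, the InstanceInfo values are illustrative, and a reachable local OVSDB is assumed:

    # Rough sketch of the os_vif calls exercised above; not how Nova wires it up.
    import os_vif
    from os_vif.objects import instance_info, vif

    os_vif.initialize()  # loads the linux_bridge/noop/ovs plugins, as at startup

    my_vif = vif.VIFOpenVSwitch(
        id='0b36b1a4-9ab6-49cb-9a5e-afc32792783e',
        address='fa:16:3e:44:d8:d0',
        bridge_name='br-int',
        vif_name='tap0b36b1a4-9a',
    )
    inst = instance_info.InstanceInfo(
        uuid='d4ea4d29-b178-4da2-b971-76f97031b244',
        name='tempest-ServersNegativeTestJSON-server-1335393395',
    )
    os_vif.unplug(my_vif, inst)  # ends up as DelPortCommand(port=..., bridge=br-int)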
Apr 20 16:11:31 user nova-compute[71605]: DEBUG nova.compute.manager [req-cdbd0bb2-1072-4624-b7a1-a676f8962652 req-cc36dc29-76c9-4d43-b648-087a87d4e54a service nova] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Received event network-vif-deleted-0b36b1a4-9ab6-49cb-9a5e-afc32792783e {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:11:31 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:11:31 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:11:31 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:11:31 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:11:31 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.165s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:11:31 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Deleted allocations for instance d4ea4d29-b178-4da2-b971-76f97031b244 Apr 20 16:11:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f62dd740-316f-415d-907a-d2d1a0a8acf4 tempest-ServersNegativeTestJSON-942369263 tempest-ServersNegativeTestJSON-942369263-project-member] Lock "d4ea4d29-b178-4da2-b971-76f97031b244" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.653s {{(pid=71605) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:11:32 user nova-compute[71605]: DEBUG nova.compute.manager [req-fa5e9bba-f351-46cc-944b-7ac7b620868a req-c2b09e7b-fa78-4073-8b78-67034443f8e5 service nova] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Received event network-vif-plugged-0b36b1a4-9ab6-49cb-9a5e-afc32792783e {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:11:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-fa5e9bba-f351-46cc-944b-7ac7b620868a req-c2b09e7b-fa78-4073-8b78-67034443f8e5 service nova] Acquiring lock "d4ea4d29-b178-4da2-b971-76f97031b244-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:11:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-fa5e9bba-f351-46cc-944b-7ac7b620868a req-c2b09e7b-fa78-4073-8b78-67034443f8e5 service nova] Lock "d4ea4d29-b178-4da2-b971-76f97031b244-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:11:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-fa5e9bba-f351-46cc-944b-7ac7b620868a req-c2b09e7b-fa78-4073-8b78-67034443f8e5 service nova] Lock "d4ea4d29-b178-4da2-b971-76f97031b244-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:11:32 user nova-compute[71605]: DEBUG nova.compute.manager [req-fa5e9bba-f351-46cc-944b-7ac7b620868a req-c2b09e7b-fa78-4073-8b78-67034443f8e5 service nova] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] No waiting events found dispatching network-vif-plugged-0b36b1a4-9ab6-49cb-9a5e-afc32792783e {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:11:32 user nova-compute[71605]: WARNING nova.compute.manager [req-fa5e9bba-f351-46cc-944b-7ac7b620868a req-c2b09e7b-fa78-4073-8b78-67034443f8e5 service nova] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Received unexpected event network-vif-plugged-0b36b1a4-9ab6-49cb-9a5e-afc32792783e for instance with vm_state deleted and task_state None. 
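Annotation: the inventory reported for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 above is unchanged by the delete; only the allocations are removed. Usable capacity in placement follows (total - reserved) * allocation_ratio per resource class; a small check applying that formula to the logged inventory values:

    # Capacity as placement computes it: (total - reserved) * allocation_ratio.
    inventory = {  # values copied from the inventory data logged above
        'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)   # VCPU 48.0, MEMORY_MB 15511.0, DISK_GB 40.0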
Apr 20 16:11:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:44 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:11:44 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:11:44 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:11:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:11:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:11:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:11:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:11:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:11:44 user nova-compute[71605]: DEBUG 
oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:11:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:11:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:11:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:11:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:11:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:11:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:11:45 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
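Annotation: the disk audit above shells out to qemu-img info under oslo_concurrency.prlimit with a 1 GiB address-space cap and a 30 s CPU cap. A hedged sketch of issuing the same command through oslo.concurrency's processutils (the instance path is taken from the log; ProcessLimits/prlimit usage is my reading of the command line shown):

    # Sketch: run qemu-img info the way the resource tracker does above,
    # with address-space and CPU limits enforced via oslo_concurrency.prlimit.
    import json
    from oslo_concurrency import processutils

    limits = processutils.ProcessLimits(
        address_space=1024 * 1024 * 1024,   # --as=1073741824
        cpu_time=30)                        # --cpu=30
    out, _err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info',
        '/opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2/disk',
        '--force-share', '--output=json',
        prlimit=limits)
    info = json.loads(out)   # e.g. info['virtual-size'], info['format']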
Apr 20 16:11:45 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:11:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=8905MB free_disk=26.345783233642578GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": 
null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", 
"address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:11:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:11:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:11:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 15d42ba7-cf47-4374-83b5-06d5242951b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:11:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 6766bf80-e99c-4363-b229-84049f21b1a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:11:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:11:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:11:45 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:11:45 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:11:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:11:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.225s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:11:45 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:11:45 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] VM Stopped (Lifecycle Event) Apr 20 16:11:46 user nova-compute[71605]: DEBUG nova.compute.manager [None req-060e02c0-1f3e-43ed-b0dd-49de33f1359b None None] [instance: d4ea4d29-b178-4da2-b971-76f97031b244] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:11:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:46 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:11:46 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None 
req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:11:46 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:11:46 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:11:47 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:11:49 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:11:49 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:11:49 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:11:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-15d42ba7-cf47-4374-83b5-06d5242951b7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:11:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-15d42ba7-cf47-4374-83b5-06d5242951b7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:11:49 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:11:49 user nova-compute[71605]: DEBUG nova.objects.instance [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lazy-loading 'info_cache' on Instance uuid 15d42ba7-cf47-4374-83b5-06d5242951b7 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:11:49 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Updating instance_info_cache with network_info: [{"id": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "address": "fa:16:3e:15:a2:f4", "network": {"id": "9de26342-0f6c-4d7d-96a5-d4ad35573211", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1378273293-network", "subnets": [{"cidr": 
"10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.9", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbd2a72dddad4f2892243a33df4fa2d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape068d7e5-dc", "ovs_interfaceid": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:11:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock "refresh_cache-15d42ba7-cf47-4374-83b5-06d5242951b7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:11:49 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:11:50 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:11:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:11:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:12:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:12:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:12:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:12:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:12:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:12:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:12:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:12:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:12:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:12:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:12:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:12:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:12:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:12:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:12:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:12:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:12:23 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:12:24 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:12:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:12:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:12:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:12:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:12:44 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:12:44 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] 
Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:12:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:12:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:12:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:12:44 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:12:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:12:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:12:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:12:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:12:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None 
None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:12:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json" returned: 0 in 0.158s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:12:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:12:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json" returned: 0 in 0.151s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:12:45 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:12:45 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
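The audit entries above shell out to qemu-img under oslo_concurrency.prlimit to size each instance disk. Below is a minimal sketch of that same probe run directly, assuming qemu-img is installed and using an illustrative disk path; it is not Nova's implementation, only the command from the log reproduced with the same resource caps:

    import json
    import resource
    import subprocess

    def qemu_img_info(disk_path):
        """Probe a disk image the way the audit above does (illustrative helper)."""
        def limit_child():
            # Mirror the caps applied via `-m oslo_concurrency.prlimit --as=1073741824 --cpu=30`
            # so a misbehaving qemu-img cannot exhaust memory or hang the audit.
            resource.setrlimit(resource.RLIMIT_AS, (1073741824, 1073741824))
            resource.setrlimit(resource.RLIMIT_CPU, (30, 30))
        result = subprocess.run(
            ["qemu-img", "info", disk_path, "--force-share", "--output=json"],
            env={"LC_ALL": "C", "LANG": "C"},
            capture_output=True, text=True, check=True, preexec_fn=limit_child,
        )
        return json.loads(result.stdout)

    # Example usage (path is illustrative):
    # info = qemu_img_info("/opt/stack/data/nova/instances/<uuid>/disk")
    # print(info.get("virtual-size"), info.get("actual-size"))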
Apr 20 16:12:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=8924MB free_disk=26.3433837890625GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:12:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:12:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:12:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 15d42ba7-cf47-4374-83b5-06d5242951b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:12:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 6766bf80-e99c-4363-b229-84049f21b1a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:12:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:12:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:12:45 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:12:45 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:12:45 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:12:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.187s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:12:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:12:47 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:12:47 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:12:47 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:12:47 user nova-compute[71605]: DEBUG oslo_service.periodic_task 
[None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:12:47 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:12:49 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:12:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:12:51 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:12:51 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:12:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-6766bf80-e99c-4363-b229-84049f21b1a2" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:12:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-6766bf80-e99c-4363-b229-84049f21b1a2" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:12:51 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:12:51 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Updating instance_info_cache with network_info: [{"id": "6ee15985-c252-4270-ad30-f6f5efe94b86", "address": "fa:16:3e:7f:f1:65", "network": {"id": "9ac82ace-9d10-48a7-8d31-7e6637f442b4", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-98485346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3cc4f384caf14fb3baff01938400bb4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee15985-c2", "ovs_interfaceid": "6ee15985-c252-4270-ad30-f6f5efe94b86", "qbh_params": null, "qbg_params": null, "active": 
true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:12:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock "refresh_cache-6766bf80-e99c-4363-b229-84049f21b1a2" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:12:51 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:12:52 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:12:52 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:12:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Acquiring lock "6766bf80-e99c-4363-b229-84049f21b1a2" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Lock "6766bf80-e99c-4363-b229-84049f21b1a2" acquired by 
"nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Acquiring lock "6766bf80-e99c-4363-b229-84049f21b1a2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Lock "6766bf80-e99c-4363-b229-84049f21b1a2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Lock "6766bf80-e99c-4363-b229-84049f21b1a2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:13:01 user nova-compute[71605]: INFO nova.compute.manager [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Terminating instance Apr 20 16:13:01 user nova-compute[71605]: DEBUG nova.compute.manager [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Start destroying the instance on the hypervisor. 
{{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG nova.compute.manager [req-675631a7-3d3d-4f2c-a7cb-8a940a5a1732 req-59266d7c-2b32-4c82-bcd1-355f645899d4 service nova] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Received event network-vif-unplugged-6ee15985-c252-4270-ad30-f6f5efe94b86 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-675631a7-3d3d-4f2c-a7cb-8a940a5a1732 req-59266d7c-2b32-4c82-bcd1-355f645899d4 service nova] Acquiring lock "6766bf80-e99c-4363-b229-84049f21b1a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-675631a7-3d3d-4f2c-a7cb-8a940a5a1732 req-59266d7c-2b32-4c82-bcd1-355f645899d4 service nova] Lock "6766bf80-e99c-4363-b229-84049f21b1a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-675631a7-3d3d-4f2c-a7cb-8a940a5a1732 req-59266d7c-2b32-4c82-bcd1-355f645899d4 service nova] Lock "6766bf80-e99c-4363-b229-84049f21b1a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG nova.compute.manager [req-675631a7-3d3d-4f2c-a7cb-8a940a5a1732 req-59266d7c-2b32-4c82-bcd1-355f645899d4 service nova] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] No waiting events found dispatching network-vif-unplugged-6ee15985-c252-4270-ad30-f6f5efe94b86 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG nova.compute.manager [req-675631a7-3d3d-4f2c-a7cb-8a940a5a1732 req-59266d7c-2b32-4c82-bcd1-355f645899d4 service nova] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Received event network-vif-unplugged-6ee15985-c252-4270-ad30-f6f5efe94b86 for instance with task_state deleting. 
{{(pid=71605) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:01 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Instance destroyed successfully. Apr 20 16:13:01 user nova-compute[71605]: DEBUG nova.objects.instance [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Lazy-loading 'resources' on Instance uuid 6766bf80-e99c-4363-b229-84049f21b1a2 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:11:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-771415346',display_name='tempest-SnapshotDataIntegrityTests-server-771415346',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-771415346',id=19,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL7kqNWnEn2hJiKbn+roVzFAUQv+vrvS2keY49vWi3kz6H/X1mSRhv0lb/LsmmiG61nDcYpkJebf4uXL/6YHuVK1fQRP8/2MwnEkEV1Hk939UStYtsqYSVjo2UDwQtWlYg==',key_name='tempest-SnapshotDataIntegrityTests-1504266769',keypairs=,launch_index=0,launched_at=2023-04-20T16:11:14Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='3cc4f384caf14fb3baff01938400bb4b',ramdisk_id='',reservation_id='r-gi8180jx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-SnapshotDataIntegrityTests-901842767',owner_user_name='tempest-SnapshotDataIntegrityTests-901842767-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:11:15Z,user_data=None,user_id='851a3140769443e4b62b73b987e2b417',uuid=6766bf80-e99c-4363-b229-84049f21b1a2,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6ee15985-c252-4270-ad30-f6f5efe94b86", "address": "fa:16:3e:7f:f1:65", "network": {"id": "9ac82ace-9d10-48a7-8d31-7e6637f442b4", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-98485346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "3cc4f384caf14fb3baff01938400bb4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee15985-c2", "ovs_interfaceid": "6ee15985-c252-4270-ad30-f6f5efe94b86", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Converting VIF {"id": "6ee15985-c252-4270-ad30-f6f5efe94b86", "address": "fa:16:3e:7f:f1:65", "network": {"id": "9ac82ace-9d10-48a7-8d31-7e6637f442b4", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-98485346-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "3cc4f384caf14fb3baff01938400bb4b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ee15985-c2", "ovs_interfaceid": "6ee15985-c252-4270-ad30-f6f5efe94b86", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7f:f1:65,bridge_name='br-int',has_traffic_filtering=True,id=6ee15985-c252-4270-ad30-f6f5efe94b86,network=Network(9ac82ace-9d10-48a7-8d31-7e6637f442b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ee15985-c2') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG os_vif [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:f1:65,bridge_name='br-int',has_traffic_filtering=True,id=6ee15985-c252-4270-ad30-f6f5efe94b86,network=Network(9ac82ace-9d10-48a7-8d31-7e6637f442b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ee15985-c2') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ee15985-c2, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:13:01 user nova-compute[71605]: INFO os_vif [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:f1:65,bridge_name='br-int',has_traffic_filtering=True,id=6ee15985-c252-4270-ad30-f6f5efe94b86,network=Network(9ac82ace-9d10-48a7-8d31-7e6637f442b4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ee15985-c2') Apr 20 16:13:01 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Deleting instance files /opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2_del Apr 20 16:13:01 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 
tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Deletion of /opt/stack/data/nova/instances/6766bf80-e99c-4363-b229-84049f21b1a2_del complete Apr 20 16:13:01 user nova-compute[71605]: INFO nova.compute.manager [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 20 16:13:01 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:13:01 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:13:02 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:13:02 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Took 0.84 seconds to deallocate network for instance. 
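The DelPortCommand transaction logged above (port=tap6ee15985-c2, bridge=br-int, if_exists=True) is what os-vif commits through ovsdbapp when it unplugs the VIF. As a rough sketch, the same effect can be had from the ovs-vsctl CLI; the helper below is illustrative only and not os-vif code:

    import subprocess

    def del_ovs_port(bridge, port):
        # --if-exists keeps the call idempotent, matching if_exists=True in the
        # DelPortCommand shown in the log above.
        subprocess.run(["ovs-vsctl", "--if-exists", "del-port", bridge, port], check=True)

    # del_ovs_port("br-int", "tap6ee15985-c2")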
Apr 20 16:13:02 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:13:02 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:13:02 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:13:02 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:13:02 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.138s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:13:02 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Deleted allocations for instance 6766bf80-e99c-4363-b229-84049f21b1a2 Apr 20 16:13:02 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-535da986-0849-4a33-ac40-3691c303e59c tempest-SnapshotDataIntegrityTests-901842767 tempest-SnapshotDataIntegrityTests-901842767-project-member] Lock "6766bf80-e99c-4363-b229-84049f21b1a2" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.816s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:13:03 user nova-compute[71605]: DEBUG nova.compute.manager [req-b8593d38-777b-4b75-b358-f8b148cec860 req-35dcb8ed-ca63-4f1b-b9f9-f730b87a41e9 service nova] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Received event network-vif-plugged-6ee15985-c252-4270-ad30-f6f5efe94b86 {{(pid=71605) 
external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:13:03 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b8593d38-777b-4b75-b358-f8b148cec860 req-35dcb8ed-ca63-4f1b-b9f9-f730b87a41e9 service nova] Acquiring lock "6766bf80-e99c-4363-b229-84049f21b1a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:13:03 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b8593d38-777b-4b75-b358-f8b148cec860 req-35dcb8ed-ca63-4f1b-b9f9-f730b87a41e9 service nova] Lock "6766bf80-e99c-4363-b229-84049f21b1a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:13:03 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b8593d38-777b-4b75-b358-f8b148cec860 req-35dcb8ed-ca63-4f1b-b9f9-f730b87a41e9 service nova] Lock "6766bf80-e99c-4363-b229-84049f21b1a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:13:03 user nova-compute[71605]: DEBUG nova.compute.manager [req-b8593d38-777b-4b75-b358-f8b148cec860 req-35dcb8ed-ca63-4f1b-b9f9-f730b87a41e9 service nova] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] No waiting events found dispatching network-vif-plugged-6ee15985-c252-4270-ad30-f6f5efe94b86 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:13:03 user nova-compute[71605]: WARNING nova.compute.manager [req-b8593d38-777b-4b75-b358-f8b148cec860 req-35dcb8ed-ca63-4f1b-b9f9-f730b87a41e9 service nova] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Received unexpected event network-vif-plugged-6ee15985-c252-4270-ad30-f6f5efe94b86 for instance with vm_state deleted and task_state None. 
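The inventory payload reported above is what placement uses to bound allocations: roughly, the consumable amount of each resource class is (total - reserved) * allocation_ratio. A quick check of that arithmetic with the values from the log (plain Python, not Nova code):

    inventory = {
        "VCPU":      {"total": 12,    "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 16023, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 40,    "reserved": 0,   "allocation_ratio": 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(rc, capacity)
    # VCPU 48.0, MEMORY_MB 15511.0, DISK_GB 40.0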
Apr 20 16:13:03 user nova-compute[71605]: DEBUG nova.compute.manager [req-b8593d38-777b-4b75-b358-f8b148cec860 req-35dcb8ed-ca63-4f1b-b9f9-f730b87a41e9 service nova] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Received event network-vif-deleted-6ee15985-c252-4270-ad30-f6f5efe94b86 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:13:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:16 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:13:16 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] VM Stopped (Lifecycle Event) Apr 20 16:13:16 user nova-compute[71605]: DEBUG nova.compute.manager [None req-793b3123-0b16-424a-ab90-a270c3611d29 None None] [instance: 6766bf80-e99c-4363-b229-84049f21b1a2] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:13:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:13:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:13:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:13:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:13:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:13:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:24 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:13:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 
{{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:13:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:13:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:13:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:44 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:13:44 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Cleaning up deleted instances {{(pid=71605) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 20 16:13:44 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] There are 0 instances to clean {{(pid=71605) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 20 16:13:46 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:13:46 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:13:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:13:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:13:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:13:46 
user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:13:46 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:13:46 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:13:46 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:13:46 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:13:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:47 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:13:47 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
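The recurring "Running periodic task ..." blocks above, including the update_available_resource audit that repeats about once a minute, come from oslo.service's periodic task runner. A stripped-down, stdlib-only sketch of that cadence (not oslo.service's implementation; task names and intervals are illustrative):

    import time

    def poll_rescued_instances():
        pass  # placeholder body; the real work lives in ComputeManager

    def update_available_resource():
        pass  # placeholder body

    # task -> interval in seconds (illustrative; the audit above recurs ~60 s apart)
    PERIODIC_TASKS = {poll_rescued_instances: 60, update_available_resource: 60}

    def run_periodic_tasks_forever():
        last_run = {task: 0.0 for task in PERIODIC_TASKS}
        while True:
            now = time.monotonic()
            for task, interval in PERIODIC_TASKS.items():
                if now - last_run[task] >= interval:
                    print(f"Running periodic task {task.__name__}")
                    task()
                    last_run[task] = now
            time.sleep(1)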
Apr 20 16:13:47 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=9053MB free_disk=26.361835479736328GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:13:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:13:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:13:47 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 15d42ba7-cf47-4374-83b5-06d5242951b7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:13:47 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:13:47 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:13:47 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:13:47 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:13:47 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:13:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.209s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:13:47 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:13:47 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Cleaning up deleted instances with incomplete migration {{(pid=71605) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 20 16:13:48 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:13:48 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:13:48 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:13:48 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:13:48 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:13:51 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:13:51 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:13:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:13:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:13:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:13:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:13:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:52 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:13:52 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:13:52 user nova-compute[71605]: DEBUG 
nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:13:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-15d42ba7-cf47-4374-83b5-06d5242951b7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:13:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-15d42ba7-cf47-4374-83b5-06d5242951b7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:13:52 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:13:52 user nova-compute[71605]: DEBUG nova.objects.instance [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lazy-loading 'info_cache' on Instance uuid 15d42ba7-cf47-4374-83b5-06d5242951b7 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:13:52 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Updating instance_info_cache with network_info: [{"id": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "address": "fa:16:3e:15:a2:f4", "network": {"id": "9de26342-0f6c-4d7d-96a5-d4ad35573211", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1378273293-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.9", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbd2a72dddad4f2892243a33df4fa2d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape068d7e5-dc", "ovs_interfaceid": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:13:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock "refresh_cache-15d42ba7-cf47-4374-83b5-06d5242951b7" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:13:52 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:13:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:54 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:13:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:13:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:14:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:14:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:14:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:14:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:14:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:14:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:14:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:14:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:14:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:14:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:14:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:14:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:14:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:14:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:14:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:14:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:14:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:14:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:14:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:14:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:14:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:14:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:14:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:14:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:14:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Acquiring lock "15d42ba7-cf47-4374-83b5-06d5242951b7" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:14:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Lock "15d42ba7-cf47-4374-83b5-06d5242951b7" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:14:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Acquiring lock "15d42ba7-cf47-4374-83b5-06d5242951b7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:14:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Lock "15d42ba7-cf47-4374-83b5-06d5242951b7-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:14:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Lock "15d42ba7-cf47-4374-83b5-06d5242951b7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:14:42 user nova-compute[71605]: INFO nova.compute.manager [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Terminating instance Apr 20 16:14:42 user nova-compute[71605]: DEBUG nova.compute.manager [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Start destroying the instance on the hypervisor. {{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:14:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:14:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:14:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:14:42 user nova-compute[71605]: DEBUG nova.compute.manager [req-6ff7fdda-7934-41e0-9529-fb84f712e8cc req-fdc2145a-c566-4fa6-a039-ddab666555cc service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Received event network-vif-unplugged-e068d7e5-dc70-4b18-8dd6-5726f7a3bc84 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:14:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-6ff7fdda-7934-41e0-9529-fb84f712e8cc req-fdc2145a-c566-4fa6-a039-ddab666555cc service nova] Acquiring lock "15d42ba7-cf47-4374-83b5-06d5242951b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:14:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-6ff7fdda-7934-41e0-9529-fb84f712e8cc req-fdc2145a-c566-4fa6-a039-ddab666555cc service nova] Lock "15d42ba7-cf47-4374-83b5-06d5242951b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:14:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-6ff7fdda-7934-41e0-9529-fb84f712e8cc req-fdc2145a-c566-4fa6-a039-ddab666555cc service nova] Lock "15d42ba7-cf47-4374-83b5-06d5242951b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:14:42 user nova-compute[71605]: DEBUG 
nova.compute.manager [req-6ff7fdda-7934-41e0-9529-fb84f712e8cc req-fdc2145a-c566-4fa6-a039-ddab666555cc service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] No waiting events found dispatching network-vif-unplugged-e068d7e5-dc70-4b18-8dd6-5726f7a3bc84 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:14:42 user nova-compute[71605]: DEBUG nova.compute.manager [req-6ff7fdda-7934-41e0-9529-fb84f712e8cc req-fdc2145a-c566-4fa6-a039-ddab666555cc service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Received event network-vif-unplugged-e068d7e5-dc70-4b18-8dd6-5726f7a3bc84 for instance with task_state deleting. {{(pid=71605) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 16:14:43 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Instance destroyed successfully. Apr 20 16:14:43 user nova-compute[71605]: DEBUG nova.objects.instance [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Lazy-loading 'resources' on Instance uuid 15d42ba7-cf47-4374-83b5-06d5242951b7 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:14:43 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:05:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-756269820',display_name='tempest-ServerActionsTestJSON-server-756269820',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-756269820',id=15,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMjwUpDWFcob5xB4VqDuXPX9FXT3Oo4If754w5lrosRMsv11HN44JSOF4mrro0tvAJdzBl68kfqgDpMmfJchN9rJpHKumya051JNHX7iD1cSwO0dYRTlSqqNhb1fgqIedQ==',key_name='tempest-keypair-1949594234',keypairs=,launch_index=0,launched_at=2023-04-20T16:06:01Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='fbd2a72dddad4f2892243a33df4fa2d1',ramdisk_id='',reservation_id='r-wsyefwrb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerActionsTestJSON-893965653',owner_user_name='tempest-ServerActionsTestJSON-893965653-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:06:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='dd6dee2194d04f45a81fd0ef45ca0632',uuid=15d42ba7-cf47-4374-83b5-06d5242951b7,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "address": "fa:16:3e:15:a2:f4", "network": {"id": "9de26342-0f6c-4d7d-96a5-d4ad35573211", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1378273293-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.9", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbd2a72dddad4f2892243a33df4fa2d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape068d7e5-dc", "ovs_interfaceid": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:14:43 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Converting VIF {"id": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "address": "fa:16:3e:15:a2:f4", "network": {"id": "9de26342-0f6c-4d7d-96a5-d4ad35573211", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1378273293-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.9", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fbd2a72dddad4f2892243a33df4fa2d1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape068d7e5-dc", "ovs_interfaceid": "e068d7e5-dc70-4b18-8dd6-5726f7a3bc84", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:14:43 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:15:a2:f4,bridge_name='br-int',has_traffic_filtering=True,id=e068d7e5-dc70-4b18-8dd6-5726f7a3bc84,network=Network(9de26342-0f6c-4d7d-96a5-d4ad35573211),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape068d7e5-dc') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:14:43 user nova-compute[71605]: DEBUG os_vif [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:a2:f4,bridge_name='br-int',has_traffic_filtering=True,id=e068d7e5-dc70-4b18-8dd6-5726f7a3bc84,network=Network(9de26342-0f6c-4d7d-96a5-d4ad35573211),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape068d7e5-dc') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:14:43 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:14:43 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape068d7e5-dc, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:14:43 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:14:43 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:14:43 user nova-compute[71605]: INFO os_vif [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:a2:f4,bridge_name='br-int',has_traffic_filtering=True,id=e068d7e5-dc70-4b18-8dd6-5726f7a3bc84,network=Network(9de26342-0f6c-4d7d-96a5-d4ad35573211),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape068d7e5-dc') Apr 20 16:14:43 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 
tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Deleting instance files /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7_del Apr 20 16:14:43 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Deletion of /opt/stack/data/nova/instances/15d42ba7-cf47-4374-83b5-06d5242951b7_del complete Apr 20 16:14:43 user nova-compute[71605]: INFO nova.compute.manager [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Took 0.85 seconds to destroy the instance on the hypervisor. Apr 20 16:14:43 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:14:43 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:14:43 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:14:44 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:14:44 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Took 0.77 seconds to deallocate network for instance. 
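The teardown above converts the Neutron port into an os-vif `VIFOpenVSwitch` object and hands it to os-vif, which removes `tape068d7e5-dc` from `br-int` through ovsdbapp's `DelPortCommand`. A minimal sketch of that module-level os-vif call follows, assuming the public `os_vif.initialize()` / `os_vif.unplug()` API; the field values are copied from the log, while the object wiring (port profile, plugin selection) is deliberately simplified and should be treated as an assumption, not Nova's actual code path.

```python
# Minimal sketch, assuming the os-vif module-level API: unplug an OVS port the
# way the "Unplugging vif VIFOpenVSwitch(...)" step above does. Values are
# taken from the log; the simplified object construction is an assumption.
import os_vif
from os_vif.objects import instance_info, network, vif

os_vif.initialize()  # loads the linux_bridge/noop/ovs plugins, as at service start

my_vif = vif.VIFOpenVSwitch(
    id="e068d7e5-dc70-4b18-8dd6-5726f7a3bc84",
    address="fa:16:3e:15:a2:f4",
    vif_name="tape068d7e5-dc",
    bridge_name="br-int",
    network=network.Network(id="9de26342-0f6c-4d7d-96a5-d4ad35573211"),
)
instance = instance_info.InstanceInfo(
    uuid="15d42ba7-cf47-4374-83b5-06d5242951b7",
    name="tempest-ServerActionsTestJSON-server-756269820",
)

os_vif.unplug(my_vif, instance)  # removes tape068d7e5-dc from br-int
```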
Apr 20 16:14:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:14:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:14:44 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Refreshing inventories for resource provider 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 20 16:14:44 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Updating ProviderTree inventory for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 20 16:14:44 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Updating inventory in ProviderTree for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 20 16:14:44 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Refreshing aggregate associations for resource provider 00e9f769-1a1c-4f1e-80e4-b19657803102, aggregates: None {{(pid=71605) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 20 16:14:44 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Refreshing trait associations for resource provider 00e9f769-1a1c-4f1e-80e4-b19657803102, traits: 
COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_MMX,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI {{(pid=71605) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 20 16:14:44 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:14:44 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:14:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.419s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:14:44 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Deleted allocations for instance 15d42ba7-cf47-4374-83b5-06d5242951b7 Apr 20 16:14:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-e3b68293-d83d-4b31-9546-37b2b0a13c44 tempest-ServerActionsTestJSON-893965653 tempest-ServerActionsTestJSON-893965653-project-member] Lock "15d42ba7-cf47-4374-83b5-06d5242951b7" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.209s {{(pid=71605) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:14:44 user nova-compute[71605]: DEBUG nova.compute.manager [req-546dda97-69c9-4f5d-9a48-cb6d91761b2a req-1c32ee3a-b2bd-4885-81af-cf2ee72daa19 service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Received event network-vif-plugged-e068d7e5-dc70-4b18-8dd6-5726f7a3bc84 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:14:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-546dda97-69c9-4f5d-9a48-cb6d91761b2a req-1c32ee3a-b2bd-4885-81af-cf2ee72daa19 service nova] Acquiring lock "15d42ba7-cf47-4374-83b5-06d5242951b7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:14:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-546dda97-69c9-4f5d-9a48-cb6d91761b2a req-1c32ee3a-b2bd-4885-81af-cf2ee72daa19 service nova] Lock "15d42ba7-cf47-4374-83b5-06d5242951b7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:14:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-546dda97-69c9-4f5d-9a48-cb6d91761b2a req-1c32ee3a-b2bd-4885-81af-cf2ee72daa19 service nova] Lock "15d42ba7-cf47-4374-83b5-06d5242951b7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:14:44 user nova-compute[71605]: DEBUG nova.compute.manager [req-546dda97-69c9-4f5d-9a48-cb6d91761b2a req-1c32ee3a-b2bd-4885-81af-cf2ee72daa19 service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] No waiting events found dispatching network-vif-plugged-e068d7e5-dc70-4b18-8dd6-5726f7a3bc84 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:14:44 user nova-compute[71605]: WARNING nova.compute.manager [req-546dda97-69c9-4f5d-9a48-cb6d91761b2a req-1c32ee3a-b2bd-4885-81af-cf2ee72daa19 service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Received unexpected event network-vif-plugged-e068d7e5-dc70-4b18-8dd6-5726f7a3bc84 for instance with vm_state deleted and task_state None. 
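The inventory payload repeated above for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 determines how much capacity placement can hand out from this node. A quick worked check, assuming placement's usual capacity rule capacity = (total - reserved) * allocation_ratio, using exactly the numbers from the log:

```python
# Worked example (assumption: placement's standard capacity formula
# capacity = (total - reserved) * allocation_ratio), using the inventory
# logged for provider 00e9f769-1a1c-4f1e-80e4-b19657803102.
inventory = {
    "VCPU":      {"total": 12,    "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 16023, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 40,    "reserved": 0,   "allocation_ratio": 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    print(rc, capacity)
# VCPU 48.0, MEMORY_MB 15511.0, DISK_GB 40.0
```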
Apr 20 16:14:44 user nova-compute[71605]: DEBUG nova.compute.manager [req-546dda97-69c9-4f5d-9a48-cb6d91761b2a req-1c32ee3a-b2bd-4885-81af-cf2ee72daa19 service nova] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Received event network-vif-deleted-e068d7e5-dc70-4b18-8dd6-5726f7a3bc84 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:14:47 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:14:47 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:14:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:14:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:14:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:14:47 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:14:47 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:14:47 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
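The Acquiring/acquired/released lines around the "compute_resources" lock come from oslo.concurrency's named internal locks (the lockutils.py paths shown above). Below is a minimal sketch of that pattern, assuming the public `lockutils.synchronized` decorator; the function name and empty body are illustrative only and are not Nova's ResourceTracker implementation.

```python
# Minimal sketch, assuming oslo.concurrency's lockutils.synchronized decorator.
# Entering the function acquires the named semaphore and produces the same
# "Lock ... acquired/released" DEBUG lines seen in the log above.
from oslo_concurrency import lockutils

@lockutils.synchronized("compute_resources")
def update_available_resource():
    # Placeholder body: in nova-compute this is where the resource tracker
    # updates usage while holding the "compute_resources" lock.
    pass

update_available_resource()
```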
Apr 20 16:14:47 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=9200MB free_disk=26.380245208740234GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:14:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:14:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:14:47 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:14:47 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:14:47 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:14:47 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:14:47 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:14:47 user nova-compute[71605]: DEBUG 
oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.129s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:14:48 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:14:48 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:14:48 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:14:48 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:14:49 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:14:49 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:14:51 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:14:52 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:14:52 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:14:52 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:14:52 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:14:52 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Didn't find any instances for 
network info cache update. {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 20 16:14:53 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:14:54 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:14:58 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:14:58 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] VM Stopped (Lifecycle Event) Apr 20 16:14:58 user nova-compute[71605]: DEBUG nova.compute.manager [None req-298eaa10-4a0b-4e9d-869b-54e2c09d0994 None None] [instance: 15d42ba7-cf47-4374-83b5-06d5242951b7] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:14:58 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:14:58 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:14:58 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:14:58 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:14:58 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:14:58 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:03 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:15:08 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:15:13 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:15:18 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:15:23 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:28 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:32 user nova-compute[71605]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:33 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:35 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:38 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:43 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:43 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:44 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "cabd55bf-46c4-41be-942d-b6563f6b2778" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:15:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "cabd55bf-46c4-41be-942d-b6563f6b2778" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:15:44 user nova-compute[71605]: DEBUG nova.compute.manager [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Starting instance... 
{{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:15:45 user nova-compute[71605]: INFO nova.compute.claims [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Claim successful on node user Apr 20 16:15:45 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.187s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-7334fc27-f319-43b1-988c-c7ba5593b806 
tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Start building networks asynchronously for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Allocating IP information in the background. {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG nova.network.neutron [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:15:45 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 16:15:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG nova.policy [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8c79a05e12ae4aab91bc79d32b02ef46', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8f978ad5201e412894f30daa8e2bd2e8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Start spawning the instance on the hypervisor. 
{{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:15:45 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Creating image(s) Apr 20 16:15:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "/opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "/opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "/opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.017s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.133s {{(pid=71605) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.142s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk 1073741824" returned: 0 in 0.085s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG 
oslo_concurrency.lockutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.233s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:15:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:15:46 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.132s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:15:46 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Checking if we can resize image /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk. 
size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:15:46 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:15:46 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:15:46 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Cannot resize image /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk to a smaller size. {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:15:46 user nova-compute[71605]: DEBUG nova.objects.instance [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lazy-loading 'migration_context' on Instance uuid cabd55bf-46c4-41be-942d-b6563f6b2778 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:15:46 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:15:46 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Ensure instance console log exists: /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:15:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:15:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None 
req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:15:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:15:46 user nova-compute[71605]: DEBUG nova.network.neutron [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Successfully created port: 51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.network.neutron [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Successfully updated port: 51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.compute.manager [req-11ade07e-2e9a-41b3-b5fe-f96f23e22c43 req-6f3402ea-0c02-4f28-acf6-dce3d56b75fb service nova] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Received event network-changed-51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.compute.manager [req-11ade07e-2e9a-41b3-b5fe-f96f23e22c43 req-6f3402ea-0c02-4f28-acf6-dce3d56b75fb service nova] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Refreshing instance network info cache due to event network-changed-51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff. 
{{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-11ade07e-2e9a-41b3-b5fe-f96f23e22c43 req-6f3402ea-0c02-4f28-acf6-dce3d56b75fb service nova] Acquiring lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-11ade07e-2e9a-41b3-b5fe-f96f23e22c43 req-6f3402ea-0c02-4f28-acf6-dce3d56b75fb service nova] Acquired lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.network.neutron [req-11ade07e-2e9a-41b3-b5fe-f96f23e22c43 req-6f3402ea-0c02-4f28-acf6-dce3d56b75fb service nova] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Refreshing network info cache for port 51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.network.neutron [req-11ade07e-2e9a-41b3-b5fe-f96f23e22c43 req-6f3402ea-0c02-4f28-acf6-dce3d56b75fb service nova] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Instance cache missing network info. {{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.network.neutron [req-11ade07e-2e9a-41b3-b5fe-f96f23e22c43 req-6f3402ea-0c02-4f28-acf6-dce3d56b75fb service nova] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-11ade07e-2e9a-41b3-b5fe-f96f23e22c43 req-6f3402ea-0c02-4f28-acf6-dce3d56b75fb service nova] Releasing lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquired lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.network.neutron [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 
cabd55bf-46c4-41be-942d-b6563f6b2778] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.network.neutron [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Instance cache missing network info. {{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.network.neutron [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Updating instance_info_cache with network_info: [{"id": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "address": "fa:16:3e:5f:14:92", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a9dc4c-8c", "ovs_interfaceid": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Releasing lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.compute.manager [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Instance network_info: |[{"id": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "address": "fa:16:3e:5f:14:92", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a9dc4c-8c", 
"ovs_interfaceid": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Start _get_guest_xml network_info=[{"id": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "address": "fa:16:3e:5f:14:92", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a9dc4c-8c", "ovs_interfaceid": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': '4ac69ea5-e5d7-40c8-864e-0a164d78a727'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:15:47 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:15:47 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 
tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:15:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1055967206',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1055967206',id=20,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f978ad5201e412894f30daa8e2bd2e8',ramdisk_id='',reservation_id='r-0vah7prs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2108053043',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:15:45Z,user_data=None,user_id='8c79a05e12ae4aab91bc79d32b02ef46',uuid=cabd55bf-46c4-41be-942d-b6563f6b2778,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "address": "fa:16:3e:5f:14:92", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a9dc4c-8c", "ovs_interfaceid": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Converting VIF {"id": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "address": 
"fa:16:3e:5f:14:92", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a9dc4c-8c", "ovs_interfaceid": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:14:92,bridge_name='br-int',has_traffic_filtering=True,id=51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff,network=Network(110d8e20-360f-48b7-8b42-9ae9760d39b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51a9dc4c-8c') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.objects.instance [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lazy-loading 'pci_devices' on Instance uuid cabd55bf-46c4-41be-942d-b6563f6b2778 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] End _get_guest_xml xml= Apr 20 16:15:47 user nova-compute[71605]: cabd55bf-46c4-41be-942d-b6563f6b2778 Apr 20 16:15:47 user nova-compute[71605]: instance-00000014 Apr 20 16:15:47 user nova-compute[71605]: 131072 Apr 20 16:15:47 user nova-compute[71605]: 1 Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: tempest-ServerBootFromVolumeStableRescueTest-server-1055967206 Apr 20 16:15:47 user nova-compute[71605]: 2023-04-20 16:15:47 Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: 128 Apr 20 16:15:47 user nova-compute[71605]: 1 Apr 20 16:15:47 user nova-compute[71605]: 0 Apr 20 16:15:47 user nova-compute[71605]: 0 Apr 20 16:15:47 user nova-compute[71605]: 1 Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member Apr 20 16:15:47 user nova-compute[71605]: tempest-ServerBootFromVolumeStableRescueTest-2108053043 Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user 
nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: OpenStack Foundation Apr 20 16:15:47 user nova-compute[71605]: OpenStack Nova Apr 20 16:15:47 user nova-compute[71605]: 0.0.0 Apr 20 16:15:47 user nova-compute[71605]: cabd55bf-46c4-41be-942d-b6563f6b2778 Apr 20 16:15:47 user nova-compute[71605]: cabd55bf-46c4-41be-942d-b6563f6b2778 Apr 20 16:15:47 user nova-compute[71605]: Virtual Machine Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: hvm Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Nehalem Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: /dev/urandom Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: Apr 20 16:15:47 user nova-compute[71605]: {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:15:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1055967206',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1055967206',id=20,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f978ad5201e412894f30daa8e2bd2e8',ramdisk_id='',reservation_id='r-0vah7prs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2108053043',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:15:45Z,user_data=None,user_id='8c79a05e12ae4aab91bc79d32b02ef46',uuid=cabd55bf-46c4-41be-942d-b6563f6b2778,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "address": "fa:16:3e:5f:14:92", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a9dc4c-8c", "ovs_interfaceid": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Converting VIF {"id": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "address": 
"fa:16:3e:5f:14:92", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a9dc4c-8c", "ovs_interfaceid": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:14:92,bridge_name='br-int',has_traffic_filtering=True,id=51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff,network=Network(110d8e20-360f-48b7-8b42-9ae9760d39b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51a9dc4c-8c') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG os_vif [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:14:92,bridge_name='br-int',has_traffic_filtering=True,id=51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff,network=Network(110d8e20-360f-48b7-8b42-9ae9760d39b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51a9dc4c-8c') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap51a9dc4c-8c, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:15:47 user 
nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap51a9dc4c-8c, col_values=(('external_ids', {'iface-id': '51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:14:92', 'vm-uuid': 'cabd55bf-46c4-41be-942d-b6563f6b2778'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:47 user nova-compute[71605]: INFO os_vif [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:14:92,bridge_name='br-int',has_traffic_filtering=True,id=51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff,network=Network(110d8e20-360f-48b7-8b42-9ae9760d39b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51a9dc4c-8c') Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] No BDM found with device name vda, not building metadata. {{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:15:47 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] No VIF found with MAC fa:16:3e:5f:14:92, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:15:48 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:15:48 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:15:49 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:49 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:49 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:49 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:15:49 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:15:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:15:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:15:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:15:49 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:15:49 user nova-compute[71605]: DEBUG nova.compute.manager [req-604aeda5-81d2-47bb-8ea2-06df8ffe9036 req-e59a8903-b6af-4e42-8fe3-e54987b50c3f service nova] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Received event network-vif-plugged-51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:15:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-604aeda5-81d2-47bb-8ea2-06df8ffe9036 req-e59a8903-b6af-4e42-8fe3-e54987b50c3f service nova] Acquiring lock "cabd55bf-46c4-41be-942d-b6563f6b2778-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:15:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-604aeda5-81d2-47bb-8ea2-06df8ffe9036 req-e59a8903-b6af-4e42-8fe3-e54987b50c3f service nova] Lock 
"cabd55bf-46c4-41be-942d-b6563f6b2778-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:15:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-604aeda5-81d2-47bb-8ea2-06df8ffe9036 req-e59a8903-b6af-4e42-8fe3-e54987b50c3f service nova] Lock "cabd55bf-46c4-41be-942d-b6563f6b2778-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:15:49 user nova-compute[71605]: DEBUG nova.compute.manager [req-604aeda5-81d2-47bb-8ea2-06df8ffe9036 req-e59a8903-b6af-4e42-8fe3-e54987b50c3f service nova] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] No waiting events found dispatching network-vif-plugged-51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:15:49 user nova-compute[71605]: WARNING nova.compute.manager [req-604aeda5-81d2-47bb-8ea2-06df8ffe9036 req-e59a8903-b6af-4e42-8fe3-e54987b50c3f service nova] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Received unexpected event network-vif-plugged-51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff for instance with vm_state building and task_state spawning. Apr 20 16:15:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:15:49 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:49 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:49 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:15:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:15:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:15:51 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] VM Resumed (Lifecycle Event) Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.compute.manager [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.compute.manager [req-8d171b31-7fd2-4e3e-90d1-3f45bcdf70f2 req-1f2d629f-acc3-4a75-9638-7b8e29d9fcb1 service nova] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Received event network-vif-plugged-51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8d171b31-7fd2-4e3e-90d1-3f45bcdf70f2 req-1f2d629f-acc3-4a75-9638-7b8e29d9fcb1 service nova] Acquiring lock "cabd55bf-46c4-41be-942d-b6563f6b2778-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8d171b31-7fd2-4e3e-90d1-3f45bcdf70f2 req-1f2d629f-acc3-4a75-9638-7b8e29d9fcb1 service nova] Lock "cabd55bf-46c4-41be-942d-b6563f6b2778-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8d171b31-7fd2-4e3e-90d1-3f45bcdf70f2 req-1f2d629f-acc3-4a75-9638-7b8e29d9fcb1 service nova] Lock "cabd55bf-46c4-41be-942d-b6563f6b2778-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.compute.manager [req-8d171b31-7fd2-4e3e-90d1-3f45bcdf70f2 req-1f2d629f-acc3-4a75-9638-7b8e29d9fcb1 service nova] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] No waiting events found dispatching network-vif-plugged-51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 
16:15:51 user nova-compute[71605]: WARNING nova.compute.manager [req-8d171b31-7fd2-4e3e-90d1-3f45bcdf70f2 req-1f2d629f-acc3-4a75-9638-7b8e29d9fcb1 service nova] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Received unexpected event network-vif-plugged-51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff for instance with vm_state building and task_state spawning. Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:15:51 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Instance spawned successfully. Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:15:51 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:15:51 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
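[editor's note] The `qemu-img info` invocations logged above are how the ComputeManager.update_available_resource periodic task measures per-instance disk usage; the call is wrapped in an address-space/CPU prlimit so a corrupt or hostile image cannot hang or exhaust the audit. A minimal sketch of reproducing that probe with oslo.concurrency (an assumption: the package is installed locally; the disk path is taken from the log) might look like:

import json

from oslo_concurrency import processutils

DISK = '/opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk'

# --as=1073741824 --cpu=30 in the logged command line correspond to these limits.
limits = processutils.ProcessLimits(address_space=1024 ** 3, cpu_time=30)

# With prlimit set, execute() re-runs the command under
# `python -m oslo_concurrency.prlimit ... --`, the wrapper visible in the log.
out, _err = processutils.execute(
    'env', 'LC_ALL=C', 'LANG=C',
    'qemu-img', 'info', DISK, '--force-share', '--output=json',
    prlimit=limits)

info = json.loads(out)
print(info['format'], info['virtual-size'], info['actual-size'])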
Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=9151MB free_disk=26.370521545410156GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:15:51 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:15:51 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] VM Started (Lifecycle Event) Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Found default for hw_disk_bus of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Found default for hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Synchronizing instance power state after lifecycle event 
"Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:15:51 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance cabd55bf-46c4-41be-942d-b6563f6b2778 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:15:51 user nova-compute[71605]: INFO nova.compute.manager [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Took 6.05 seconds to spawn the instance on the hypervisor. 
Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.compute.manager [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:15:51 user nova-compute[71605]: INFO nova.compute.manager [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Took 6.56 seconds to build instance. Apr 20 16:15:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-7334fc27-f319-43b1-988c-c7ba5593b806 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "cabd55bf-46c4-41be-942d-b6563f6b2778" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.649s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:15:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.196s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:15:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:53 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:15:53 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) 
run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:15:53 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:15:54 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:15:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:15:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:15:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:15:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:15:54 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:15:54 user nova-compute[71605]: DEBUG nova.objects.instance [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lazy-loading 'info_cache' on Instance uuid cabd55bf-46c4-41be-942d-b6563f6b2778 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:15:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:54 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Updating instance_info_cache with network_info: [{"id": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "address": "fa:16:3e:5f:14:92", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", 
"bound_drivers": {"0": "ovn"}}, "devname": "tap51a9dc4c-8c", "ovs_interfaceid": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:15:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:15:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:15:55 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:15:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:15:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:16:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:16:04 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:16:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:16:09 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:16:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:16:14 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:16:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:16:19 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:16:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:16:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:16:32 user 
nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:16:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:16:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:16:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:16:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:16:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:16:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:16:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:16:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:16:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:16:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:16:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:16:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:16:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:16:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:16:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:16:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:16:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:16:47 user 
nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:16:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:16:49 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:16:49 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:16:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:16:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:16:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:16:49 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:16:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:16:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json" returned: 0 in 0.169s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:16:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:16:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:16:50 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:16:50 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:16:50 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=9108MB free_disk=26.34982681274414GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", 
"product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:16:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:16:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:16:50 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance cabd55bf-46c4-41be-942d-b6563f6b2778 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:16:50 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:16:50 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:16:50 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:16:50 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:16:50 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:16:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.194s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:16:51 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:16:51 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:16:51 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:16:52 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:16:52 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:16:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:16:53 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:16:53 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:16:54 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:16:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:16:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:16:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:16:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:16:54 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:16:54 user nova-compute[71605]: DEBUG nova.objects.instance [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lazy-loading 'info_cache' on Instance uuid cabd55bf-46c4-41be-942d-b6563f6b2778 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:16:54 user nova-compute[71605]: DEBUG 
nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Updating instance_info_cache with network_info: [{"id": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "address": "fa:16:3e:5f:14:92", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a9dc4c-8c", "ovs_interfaceid": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:16:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:16:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:16:57 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:16:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:16:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:16:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:16:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:16:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:16:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:17:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:17:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:17:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:17:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:17:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:17:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:17:04 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:17:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:17:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:17:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:17:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:17:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:17:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:17:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:17:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:17:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:17:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:17:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:17:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE 
{{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:17:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:17:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:17:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:17:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:17:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:17:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:17:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:17:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:17:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:17:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:17:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:17:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:17:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:17:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:17:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:17:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:17:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:17:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: 
entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:17:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:17:35 user nova-compute[71605]: DEBUG nova.compute.manager [None req-a6e87014-9235-4337-abc6-fe7e6a5fc1ee tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:17:35 user nova-compute[71605]: INFO nova.compute.manager [None req-a6e87014-9235-4337-abc6-fe7e6a5fc1ee tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] instance snapshotting Apr 20 16:17:36 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-a6e87014-9235-4337-abc6-fe7e6a5fc1ee tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Beginning live snapshot process Apr 20 16:17:36 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a6e87014-9235-4337-abc6-fe7e6a5fc1ee tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json -f qcow2 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:17:36 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a6e87014-9235-4337-abc6-fe7e6a5fc1ee tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json -f qcow2" returned: 0 in 0.137s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:17:36 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a6e87014-9235-4337-abc6-fe7e6a5fc1ee tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json -f qcow2 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:17:36 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a6e87014-9235-4337-abc6-fe7e6a5fc1ee tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json -f qcow2" returned: 0 in 0.128s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:17:36 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a6e87014-9235-4337-abc6-fe7e6a5fc1ee tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:17:36 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a6e87014-9235-4337-abc6-fe7e6a5fc1ee tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.141s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:17:36 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a6e87014-9235-4337-abc6-fe7e6a5fc1ee tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpvlscf0yo/077c213611c84614ac8e9e5b7876f65d.delta 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:17:36 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a6e87014-9235-4337-abc6-fe7e6a5fc1ee tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpvlscf0yo/077c213611c84614ac8e9e5b7876f65d.delta 1073741824" returned: 0 in 0.057s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:17:36 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-a6e87014-9235-4337-abc6-fe7e6a5fc1ee tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Quiescing instance not available: QEMU guest agent is not enabled. 
Apr 20 16:17:37 user nova-compute[71605]: DEBUG nova.virt.libvirt.guest [None req-a6e87014-9235-4337-abc6-fe7e6a5fc1ee tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=71605) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 20 16:17:37 user nova-compute[71605]: DEBUG nova.virt.libvirt.guest [None req-a6e87014-9235-4337-abc6-fe7e6a5fc1ee tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=71605) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 20 16:17:37 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-a6e87014-9235-4337-abc6-fe7e6a5fc1ee tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Skipping quiescing instance: QEMU guest agent is not enabled. Apr 20 16:17:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:17:37 user nova-compute[71605]: DEBUG nova.privsep.utils [None req-a6e87014-9235-4337-abc6-fe7e6a5fc1ee tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71605) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 20 16:17:37 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a6e87014-9235-4337-abc6-fe7e6a5fc1ee tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpvlscf0yo/077c213611c84614ac8e9e5b7876f65d.delta /opt/stack/data/nova/instances/snapshots/tmpvlscf0yo/077c213611c84614ac8e9e5b7876f65d {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:17:38 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a6e87014-9235-4337-abc6-fe7e6a5fc1ee tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpvlscf0yo/077c213611c84614ac8e9e5b7876f65d.delta /opt/stack/data/nova/instances/snapshots/tmpvlscf0yo/077c213611c84614ac8e9e5b7876f65d" returned: 0 in 0.452s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:17:38 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-a6e87014-9235-4337-abc6-fe7e6a5fc1ee tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Snapshot extracted, beginning image upload Apr 20 16:17:40 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-a6e87014-9235-4337-abc6-fe7e6a5fc1ee tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] 
Snapshot image upload complete Apr 20 16:17:40 user nova-compute[71605]: INFO nova.compute.manager [None req-a6e87014-9235-4337-abc6-fe7e6a5fc1ee tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Took 4.64 seconds to snapshot the instance on the hypervisor. Apr 20 16:17:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:17:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:17:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:17:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:17:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:17:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:17:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:17:49 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:17:49 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:17:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:17:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:17:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:17:49 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] 
Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:17:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:17:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:17:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:17:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:17:50 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:17:50 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 16:17:50 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=9109MB free_disk=26.313064575195312GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:17:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:17:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:17:50 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance cabd55bf-46c4-41be-942d-b6563f6b2778 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:17:50 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:17:50 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:17:50 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:17:50 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:17:50 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:17:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.189s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:17:51 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:17:52 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:17:52 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:17:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:17:53 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:17:54 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:17:54 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:17:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:17:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:17:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:17:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:17:54 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:17:54 user nova-compute[71605]: DEBUG nova.objects.instance [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lazy-loading 'info_cache' on Instance uuid cabd55bf-46c4-41be-942d-b6563f6b2778 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:17:54 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Updating instance_info_cache with network_info: [{"id": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "address": "fa:16:3e:5f:14:92", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a9dc4c-8c", "ovs_interfaceid": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:17:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:17:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:17:55 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:17:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:17:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:17:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:17:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:17:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:17:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:17:59 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:17:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:18:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:18:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:18:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:18:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:18:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:18:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:18:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:18:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:18:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:18:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:18:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:18:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:18:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:18:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:18:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:18:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:18:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:18:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:18:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:18:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run 
/usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:18:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:18:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:18:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:18:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "21f02f05-1afe-4818-a4a9-14a4d6384eff" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:18:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "21f02f05-1afe-4818-a4a9-14a4d6384eff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:18:42 user nova-compute[71605]: DEBUG nova.compute.manager [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Starting instance... {{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:18:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:18:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:18:42 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:18:42 user nova-compute[71605]: INFO nova.compute.claims [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Claim successful on node user Apr 20 16:18:42 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:18:42 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:18:42 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.213s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:18:42 user nova-compute[71605]: DEBUG nova.compute.manager [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Start building networks asynchronously for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:18:42 user nova-compute[71605]: DEBUG nova.compute.manager [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Allocating IP information in the background. 
{{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:18:42 user nova-compute[71605]: DEBUG nova.network.neutron [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:18:42 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 16:18:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:18:42 user nova-compute[71605]: DEBUG nova.compute.manager [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:18:42 user nova-compute[71605]: INFO nova.virt.block_device [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Booting with blank volume at /dev/vda Apr 20 16:18:43 user nova-compute[71605]: DEBUG nova.policy [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8c79a05e12ae4aab91bc79d32b02ef46', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8f978ad5201e412894f30daa8e2bd2e8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:18:43 user nova-compute[71605]: WARNING nova.compute.manager [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Volume id: 91b31061-792b-47e1-9cd6-24ce8fa212a3 finished being created but its status is error. Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Instance failed block device setup: nova.exception.VolumeNotCreated: Volume 91b31061-792b-47e1-9cd6-24ce8fa212a3 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. 
Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Traceback (most recent call last): Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] File "/opt/stack/nova/nova/compute/manager.py", line 2175, in _prep_block_device Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] driver_block_device.attach_block_devices( Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] File "/opt/stack/nova/nova/virt/block_device.py", line 936, in attach_block_devices Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] _log_and_attach(device) Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] File "/opt/stack/nova/nova/virt/block_device.py", line 933, in _log_and_attach Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] bdm.attach(*attach_args, **attach_kwargs) Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] File "/opt/stack/nova/nova/virt/block_device.py", line 848, in attach Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] self.volume_id, self.attachment_id = self._create_volume( Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] File "/opt/stack/nova/nova/virt/block_device.py", line 435, in _create_volume Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] self._call_wait_func(context, wait_func, volume_api, vol['id']) Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] File "/opt/stack/nova/nova/virt/block_device.py", line 785, in _call_wait_func Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] with excutils.save_and_reraise_exception(): Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] self.force_reraise() Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] raise self.value Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] File "/opt/stack/nova/nova/virt/block_device.py", line 783, in _call_wait_func Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] wait_func(context, volume_id) Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] File "/opt/stack/nova/nova/compute/manager.py", line 1792, in 
_await_block_device_map_created Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] raise exception.VolumeNotCreated(volume_id=vol_id, Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] nova.exception.VolumeNotCreated: Volume 91b31061-792b-47e1-9cd6-24ce8fa212a3 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. Apr 20 16:18:43 user nova-compute[71605]: ERROR nova.compute.manager [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Apr 20 16:18:43 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:18:44 user nova-compute[71605]: DEBUG nova.network.neutron [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Successfully created port: 2917d2ac-f6bb-462b-a7b4-3ed39a7fad08 {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:18:45 user nova-compute[71605]: DEBUG nova.network.neutron [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Successfully updated port: 2917d2ac-f6bb-462b-a7b4-3ed39a7fad08 {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:18:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "refresh_cache-21f02f05-1afe-4818-a4a9-14a4d6384eff" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:18:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquired lock "refresh_cache-21f02f05-1afe-4818-a4a9-14a4d6384eff" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:18:45 user nova-compute[71605]: DEBUG nova.network.neutron [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:18:45 user nova-compute[71605]: DEBUG nova.compute.manager [req-abd42c01-687a-47dc-b35b-fff4a3758b95 req-4ba4f6be-ef26-469d-82dd-c60c8ae60a72 service nova] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Received event network-changed-2917d2ac-f6bb-462b-a7b4-3ed39a7fad08 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:18:45 user nova-compute[71605]: DEBUG nova.compute.manager [req-abd42c01-687a-47dc-b35b-fff4a3758b95 req-4ba4f6be-ef26-469d-82dd-c60c8ae60a72 service nova] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Refreshing instance network info cache due to event 
network-changed-2917d2ac-f6bb-462b-a7b4-3ed39a7fad08. {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:18:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-abd42c01-687a-47dc-b35b-fff4a3758b95 req-4ba4f6be-ef26-469d-82dd-c60c8ae60a72 service nova] Acquiring lock "refresh_cache-21f02f05-1afe-4818-a4a9-14a4d6384eff" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:18:45 user nova-compute[71605]: DEBUG nova.network.neutron [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Instance cache missing network info. {{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:18:45 user nova-compute[71605]: DEBUG nova.network.neutron [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Updating instance_info_cache with network_info: [{"id": "2917d2ac-f6bb-462b-a7b4-3ed39a7fad08", "address": "fa:16:3e:63:c9:a9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2917d2ac-f6", "ovs_interfaceid": "2917d2ac-f6bb-462b-a7b4-3ed39a7fad08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:18:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Releasing lock "refresh_cache-21f02f05-1afe-4818-a4a9-14a4d6384eff" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:18:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Instance network_info: |[{"id": "2917d2ac-f6bb-462b-a7b4-3ed39a7fad08", "address": "fa:16:3e:63:c9:a9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2917d2ac-f6", "ovs_interfaceid": "2917d2ac-f6bb-462b-a7b4-3ed39a7fad08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:18:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-abd42c01-687a-47dc-b35b-fff4a3758b95 req-4ba4f6be-ef26-469d-82dd-c60c8ae60a72 service nova] Acquired lock "refresh_cache-21f02f05-1afe-4818-a4a9-14a4d6384eff" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:18:45 user nova-compute[71605]: DEBUG nova.network.neutron [req-abd42c01-687a-47dc-b35b-fff4a3758b95 req-4ba4f6be-ef26-469d-82dd-c60c8ae60a72 service nova] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Refreshing network info cache for port 2917d2ac-f6bb-462b-a7b4-3ed39a7fad08 {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:18:45 user nova-compute[71605]: DEBUG nova.compute.claims [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Aborting claim: {{(pid=71605) abort /opt/stack/nova/nova/compute/claims.py:84}} Apr 20 16:18:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:18:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:18:45 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:18:45 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 
'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:18:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.188s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:18:45 user nova-compute[71605]: DEBUG nova.compute.manager [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Build of instance 21f02f05-1afe-4818-a4a9-14a4d6384eff aborted: Volume 91b31061-792b-47e1-9cd6-24ce8fa212a3 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. {{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2636}} Apr 20 16:18:46 user nova-compute[71605]: DEBUG nova.compute.utils [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Build of instance 21f02f05-1afe-4818-a4a9-14a4d6384eff aborted: Volume 91b31061-792b-47e1-9cd6-24ce8fa212a3 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. {{(pid=71605) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} Apr 20 16:18:46 user nova-compute[71605]: ERROR nova.compute.manager [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Build of instance 21f02f05-1afe-4818-a4a9-14a4d6384eff aborted: Volume 91b31061-792b-47e1-9cd6-24ce8fa212a3 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error.: nova.exception.BuildAbortException: Build of instance 21f02f05-1afe-4818-a4a9-14a4d6384eff aborted: Volume 91b31061-792b-47e1-9cd6-24ce8fa212a3 did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. 
Apr 20 16:18:46 user nova-compute[71605]: DEBUG nova.compute.manager [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Unplugging VIFs for instance {{(pid=71605) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} Apr 20 16:18:46 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:18:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1807434348',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1807434348',id=21,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f978ad5201e412894f30daa8e2bd2e8',ramdisk_id='',reservation_id='r-fo88hwmd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2108053043',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member'},tags=TagList,task_state='block_device_mapping',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:18:43Z,user_data=None,user_id='8c79a05e12ae4aab91bc79d32b02ef46',uuid=21f02f05-1afe-4818-a4a9-14a4d6384eff,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2917d2ac-f6bb-462b-a7b4-3ed39a7fad08", "address": "fa:16:3e:63:c9:a9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap2917d2ac-f6", "ovs_interfaceid": "2917d2ac-f6bb-462b-a7b4-3ed39a7fad08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:18:46 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Converting VIF {"id": "2917d2ac-f6bb-462b-a7b4-3ed39a7fad08", "address": "fa:16:3e:63:c9:a9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2917d2ac-f6", "ovs_interfaceid": "2917d2ac-f6bb-462b-a7b4-3ed39a7fad08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:18:46 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:c9:a9,bridge_name='br-int',has_traffic_filtering=True,id=2917d2ac-f6bb-462b-a7b4-3ed39a7fad08,network=Network(110d8e20-360f-48b7-8b42-9ae9760d39b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2917d2ac-f6') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:18:46 user nova-compute[71605]: DEBUG os_vif [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:c9:a9,bridge_name='br-int',has_traffic_filtering=True,id=2917d2ac-f6bb-462b-a7b4-3ed39a7fad08,network=Network(110d8e20-360f-48b7-8b42-9ae9760d39b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2917d2ac-f6') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:18:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:18:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2917d2ac-f6, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:18:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no 
change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:18:46 user nova-compute[71605]: INFO os_vif [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:c9:a9,bridge_name='br-int',has_traffic_filtering=True,id=2917d2ac-f6bb-462b-a7b4-3ed39a7fad08,network=Network(110d8e20-360f-48b7-8b42-9ae9760d39b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2917d2ac-f6') Apr 20 16:18:46 user nova-compute[71605]: DEBUG nova.compute.manager [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Unplugged VIFs for instance {{(pid=71605) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} Apr 20 16:18:46 user nova-compute[71605]: DEBUG nova.compute.manager [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:18:46 user nova-compute[71605]: DEBUG nova.network.neutron [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:18:46 user nova-compute[71605]: DEBUG nova.network.neutron [req-abd42c01-687a-47dc-b35b-fff4a3758b95 req-4ba4f6be-ef26-469d-82dd-c60c8ae60a72 service nova] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Updated VIF entry in instance network info cache for port 2917d2ac-f6bb-462b-a7b4-3ed39a7fad08. 
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:18:46 user nova-compute[71605]: DEBUG nova.network.neutron [req-abd42c01-687a-47dc-b35b-fff4a3758b95 req-4ba4f6be-ef26-469d-82dd-c60c8ae60a72 service nova] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Updating instance_info_cache with network_info: [{"id": "2917d2ac-f6bb-462b-a7b4-3ed39a7fad08", "address": "fa:16:3e:63:c9:a9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2917d2ac-f6", "ovs_interfaceid": "2917d2ac-f6bb-462b-a7b4-3ed39a7fad08", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:18:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-abd42c01-687a-47dc-b35b-fff4a3758b95 req-4ba4f6be-ef26-469d-82dd-c60c8ae60a72 service nova] Releasing lock "refresh_cache-21f02f05-1afe-4818-a4a9-14a4d6384eff" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:18:46 user nova-compute[71605]: DEBUG nova.network.neutron [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:18:46 user nova-compute[71605]: INFO nova.compute.manager [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 21f02f05-1afe-4818-a4a9-14a4d6384eff] Took 0.66 seconds to deallocate network for instance. 
Apr 20 16:18:46 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Deleted allocations for instance 21f02f05-1afe-4818-a4a9-14a4d6384eff Apr 20 16:18:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-95c6a2ca-22a3-4b01-bcd4-c0fb55468d22 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "21f02f05-1afe-4818-a4a9-14a4d6384eff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 4.320s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:18:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:18:49 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:18:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:18:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:18:49 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:18:49 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:18:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:18:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71605) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:18:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:18:49 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:18:50 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:18:50 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:18:50 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=9115MB free_disk=26.312122344970703GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": 
"0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:18:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:18:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:18:50 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance cabd55bf-46c4-41be-942d-b6563f6b2778 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:18:50 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:18:50 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:18:50 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:18:50 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:18:50 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:18:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.203s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:18:52 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:18:52 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:18:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:18:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:18:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run 
/usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:18:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:18:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:18:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:18:53 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:18:53 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:18:54 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:18:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:18:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:18:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:18:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:18:54 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:18:54 user nova-compute[71605]: DEBUG nova.objects.instance [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lazy-loading 'info_cache' on Instance uuid cabd55bf-46c4-41be-942d-b6563f6b2778 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:18:54 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Updating instance_info_cache with network_info: [{"id": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "address": "fa:16:3e:5f:14:92", "network": {"id": 
"110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a9dc4c-8c", "ovs_interfaceid": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:18:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:18:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:18:54 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:18:55 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:18:55 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:18:55 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Cleaning up deleted instances {{(pid=71605) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 20 16:18:55 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] There are 0 instances to clean {{(pid=71605) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 20 16:18:56 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:18:57 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71605) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:18:57 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:18:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:18:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:18:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:18:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:18:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:18:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:00 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:19:01 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:19:01 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Cleaning up deleted instances with incomplete migration {{(pid=71605) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 20 16:19:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:19:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:19:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:19:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:19:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:19:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:07 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._sync_power_states {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:19:07 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Triggering sync for uuid cabd55bf-46c4-41be-942d-b6563f6b2778 {{(pid=71605) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 20 16:19:07 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "cabd55bf-46c4-41be-942d-b6563f6b2778" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:19:07 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "cabd55bf-46c4-41be-942d-b6563f6b2778" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:19:07 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "cabd55bf-46c4-41be-942d-b6563f6b2778" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.024s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:19:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:19:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:19:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:19:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:19:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:19:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:19:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:19:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:19:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:19:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:19:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:19:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:19:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:19:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:19:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:19:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:19:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:19:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:19:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:19:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:19:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:19:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:19:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:19:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:19:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:19:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:19:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "972fbea6-71af-4e33-9f9d-d82c46fcd564" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:19:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "972fbea6-71af-4e33-9f9d-d82c46fcd564" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:19:34 user nova-compute[71605]: DEBUG nova.compute.manager [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Starting instance... {{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:19:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:19:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:19:34 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:19:34 user nova-compute[71605]: INFO nova.compute.claims [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Claim successful on node user Apr 20 16:19:34 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:19:34 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:19:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.221s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:19:34 user nova-compute[71605]: DEBUG nova.compute.manager [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Start building networks asynchronously for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:19:34 user nova-compute[71605]: DEBUG nova.compute.manager [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Allocating IP information in the background. 
{{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:19:34 user nova-compute[71605]: DEBUG nova.network.neutron [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:19:34 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 16:19:34 user nova-compute[71605]: DEBUG nova.compute.manager [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:19:34 user nova-compute[71605]: DEBUG nova.policy [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8c79a05e12ae4aab91bc79d32b02ef46', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8f978ad5201e412894f30daa8e2bd2e8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:19:34 user nova-compute[71605]: DEBUG nova.compute.manager [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Start spawning the instance on the hypervisor. 
{{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:19:34 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:19:34 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Creating image(s) Apr 20 16:19:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "/opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:19:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "/opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:19:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "/opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:19:34 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:19:35 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.133s {{(pid=71605) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:19:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "4030659dc9e6940e4f224066d06e3784b1229890" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:19:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:19:35 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:19:35 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.138s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:19:35 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:19:35 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk 1073741824" returned: 0 in 0.071s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:19:35 user nova-compute[71605]: DEBUG 
oslo_concurrency.lockutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "4030659dc9e6940e4f224066d06e3784b1229890" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.216s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:19:35 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:19:35 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.133s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:19:35 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Checking if we can resize image /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk. 
size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:19:35 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:19:35 user nova-compute[71605]: DEBUG nova.network.neutron [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Successfully created port: 5e633cb4-e056-4752-9478-8e180c9c6869 {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:19:35 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json" returned: 0 in 0.156s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:19:35 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Cannot resize image /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk to a smaller size. 
{{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:19:35 user nova-compute[71605]: DEBUG nova.objects.instance [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lazy-loading 'migration_context' on Instance uuid 972fbea6-71af-4e33-9f9d-d82c46fcd564 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:19:35 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:19:35 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Ensure instance console log exists: /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:19:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:19:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:19:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.network.neutron [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Successfully updated port: 5e633cb4-e056-4752-9478-8e180c9c6869 {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "refresh_cache-972fbea6-71af-4e33-9f9d-d82c46fcd564" {{(pid=71605) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquired lock "refresh_cache-972fbea6-71af-4e33-9f9d-d82c46fcd564" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.network.neutron [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.compute.manager [req-9581f701-1465-41fb-96b8-5e21618e4864 req-82664cce-96af-44bb-b829-5c73cf48013c service nova] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Received event network-changed-5e633cb4-e056-4752-9478-8e180c9c6869 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.compute.manager [req-9581f701-1465-41fb-96b8-5e21618e4864 req-82664cce-96af-44bb-b829-5c73cf48013c service nova] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Refreshing instance network info cache due to event network-changed-5e633cb4-e056-4752-9478-8e180c9c6869. {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-9581f701-1465-41fb-96b8-5e21618e4864 req-82664cce-96af-44bb-b829-5c73cf48013c service nova] Acquiring lock "refresh_cache-972fbea6-71af-4e33-9f9d-d82c46fcd564" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.network.neutron [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Instance cache missing network info. 
{{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.network.neutron [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Updating instance_info_cache with network_info: [{"id": "5e633cb4-e056-4752-9478-8e180c9c6869", "address": "fa:16:3e:9c:d7:a9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e633cb4-e0", "ovs_interfaceid": "5e633cb4-e056-4752-9478-8e180c9c6869", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Releasing lock "refresh_cache-972fbea6-71af-4e33-9f9d-d82c46fcd564" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.compute.manager [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Instance network_info: |[{"id": "5e633cb4-e056-4752-9478-8e180c9c6869", "address": "fa:16:3e:9c:d7:a9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e633cb4-e0", "ovs_interfaceid": "5e633cb4-e056-4752-9478-8e180c9c6869", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-9581f701-1465-41fb-96b8-5e21618e4864 req-82664cce-96af-44bb-b829-5c73cf48013c service nova] Acquired lock 
"refresh_cache-972fbea6-71af-4e33-9f9d-d82c46fcd564" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.network.neutron [req-9581f701-1465-41fb-96b8-5e21618e4864 req-82664cce-96af-44bb-b829-5c73cf48013c service nova] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Refreshing network info cache for port 5e633cb4-e056-4752-9478-8e180c9c6869 {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Start _get_guest_xml network_info=[{"id": "5e633cb4-e056-4752-9478-8e180c9c6869", "address": "fa:16:3e:9c:d7:a9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e633cb4-e0", "ovs_interfaceid": "5e633cb4-e056-4752-9478-8e180c9c6869", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': '4ac69ea5-e5d7-40c8-864e-0a164d78a727'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:19:36 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 16:19:36 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T15:59:03Z,direct_url=,disk_format='qcow2',id=4ac69ea5-e5d7-40c8-864e-0a164d78a727,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b448d7aed44e45efaa2904e3b0c4a06e',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T15:59:05Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Chose sockets=0, cores=0, threads=0; 
limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:19:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1879699440',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1879699440',id=22,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f978ad5201e412894f30daa8e2bd2e8',ramdisk_id='',reservation_id='r-neso7an2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2108053043',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:19:35Z,user_data=None,user_id='8c79a05e12ae4aab91bc79d32b02ef46',uuid=972fbea6-71af-4e33-9f9d-d82c46fcd564,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5e633cb4-e056-4752-9478-8e180c9c6869", "address": "fa:16:3e:9c:d7:a9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e633cb4-e0", "ovs_interfaceid": "5e633cb4-e056-4752-9478-8e180c9c6869", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Converting VIF {"id": "5e633cb4-e056-4752-9478-8e180c9c6869", "address": 
"fa:16:3e:9c:d7:a9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e633cb4-e0", "ovs_interfaceid": "5e633cb4-e056-4752-9478-8e180c9c6869", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9c:d7:a9,bridge_name='br-int',has_traffic_filtering=True,id=5e633cb4-e056-4752-9478-8e180c9c6869,network=Network(110d8e20-360f-48b7-8b42-9ae9760d39b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e633cb4-e0') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.objects.instance [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lazy-loading 'pci_devices' on Instance uuid 972fbea6-71af-4e33-9f9d-d82c46fcd564 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] End _get_guest_xml xml= Apr 20 16:19:36 user nova-compute[71605]: 972fbea6-71af-4e33-9f9d-d82c46fcd564 Apr 20 16:19:36 user nova-compute[71605]: instance-00000016 Apr 20 16:19:36 user nova-compute[71605]: 131072 Apr 20 16:19:36 user nova-compute[71605]: 1 Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: tempest-ServerBootFromVolumeStableRescueTest-server-1879699440 Apr 20 16:19:36 user nova-compute[71605]: 2023-04-20 16:19:36 Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: 128 Apr 20 16:19:36 user nova-compute[71605]: 1 Apr 20 16:19:36 user nova-compute[71605]: 0 Apr 20 16:19:36 user nova-compute[71605]: 0 Apr 20 16:19:36 user nova-compute[71605]: 1 Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member Apr 20 16:19:36 user nova-compute[71605]: tempest-ServerBootFromVolumeStableRescueTest-2108053043 Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user 
nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: OpenStack Foundation Apr 20 16:19:36 user nova-compute[71605]: OpenStack Nova Apr 20 16:19:36 user nova-compute[71605]: 0.0.0 Apr 20 16:19:36 user nova-compute[71605]: 972fbea6-71af-4e33-9f9d-d82c46fcd564 Apr 20 16:19:36 user nova-compute[71605]: 972fbea6-71af-4e33-9f9d-d82c46fcd564 Apr 20 16:19:36 user nova-compute[71605]: Virtual Machine Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: hvm Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Nehalem Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: /dev/urandom Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: Apr 20 16:19:36 user nova-compute[71605]: {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:19:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1879699440',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1879699440',id=22,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f978ad5201e412894f30daa8e2bd2e8',ramdisk_id='',reservation_id='r-neso7an2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2108053043',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:19:35Z,user_data=None,user_id='8c79a05e12ae4aab91bc79d32b02ef46',uuid=972fbea6-71af-4e33-9f9d-d82c46fcd564,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5e633cb4-e056-4752-9478-8e180c9c6869", "address": "fa:16:3e:9c:d7:a9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e633cb4-e0", "ovs_interfaceid": "5e633cb4-e056-4752-9478-8e180c9c6869", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Converting VIF {"id": "5e633cb4-e056-4752-9478-8e180c9c6869", "address": 
"fa:16:3e:9c:d7:a9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e633cb4-e0", "ovs_interfaceid": "5e633cb4-e056-4752-9478-8e180c9c6869", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9c:d7:a9,bridge_name='br-int',has_traffic_filtering=True,id=5e633cb4-e056-4752-9478-8e180c9c6869,network=Network(110d8e20-360f-48b7-8b42-9ae9760d39b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e633cb4-e0') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG os_vif [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9c:d7:a9,bridge_name='br-int',has_traffic_filtering=True,id=5e633cb4-e056-4752-9478-8e180c9c6869,network=Network(110d8e20-360f-48b7-8b42-9ae9760d39b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e633cb4-e0') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5e633cb4-e0, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:19:36 
user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5e633cb4-e0, col_values=(('external_ids', {'iface-id': '5e633cb4-e056-4752-9478-8e180c9c6869', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9c:d7:a9', 'vm-uuid': '972fbea6-71af-4e33-9f9d-d82c46fcd564'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:36 user nova-compute[71605]: INFO os_vif [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9c:d7:a9,bridge_name='br-int',has_traffic_filtering=True,id=5e633cb4-e056-4752-9478-8e180c9c6869,network=Network(110d8e20-360f-48b7-8b42-9ae9760d39b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e633cb4-e0') Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] No BDM found with device name vda, not building metadata. {{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:19:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] No VIF found with MAC fa:16:3e:9c:d7:a9, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:19:37 user nova-compute[71605]: DEBUG nova.network.neutron [req-9581f701-1465-41fb-96b8-5e21618e4864 req-82664cce-96af-44bb-b829-5c73cf48013c service nova] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Updated VIF entry in instance network info cache for port 5e633cb4-e056-4752-9478-8e180c9c6869. 
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:19:37 user nova-compute[71605]: DEBUG nova.network.neutron [req-9581f701-1465-41fb-96b8-5e21618e4864 req-82664cce-96af-44bb-b829-5c73cf48013c service nova] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Updating instance_info_cache with network_info: [{"id": "5e633cb4-e056-4752-9478-8e180c9c6869", "address": "fa:16:3e:9c:d7:a9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e633cb4-e0", "ovs_interfaceid": "5e633cb4-e056-4752-9478-8e180c9c6869", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:19:37 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-9581f701-1465-41fb-96b8-5e21618e4864 req-82664cce-96af-44bb-b829-5c73cf48013c service nova] Releasing lock "refresh_cache-972fbea6-71af-4e33-9f9d-d82c46fcd564" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:19:38 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:38 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:38 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:38 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:38 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:38 user nova-compute[71605]: DEBUG nova.compute.manager [req-50bde0b3-d16b-40e2-a0c2-bf941664d992 req-d8b97ed7-2f15-429f-91c7-b658456ede8b service nova] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Received event network-vif-plugged-5e633cb4-e056-4752-9478-8e180c9c6869 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:19:38 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-50bde0b3-d16b-40e2-a0c2-bf941664d992 req-d8b97ed7-2f15-429f-91c7-b658456ede8b service nova] Acquiring lock "972fbea6-71af-4e33-9f9d-d82c46fcd564-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:19:38 user nova-compute[71605]: DEBUG 
oslo_concurrency.lockutils [req-50bde0b3-d16b-40e2-a0c2-bf941664d992 req-d8b97ed7-2f15-429f-91c7-b658456ede8b service nova] Lock "972fbea6-71af-4e33-9f9d-d82c46fcd564-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:19:38 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-50bde0b3-d16b-40e2-a0c2-bf941664d992 req-d8b97ed7-2f15-429f-91c7-b658456ede8b service nova] Lock "972fbea6-71af-4e33-9f9d-d82c46fcd564-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:19:38 user nova-compute[71605]: DEBUG nova.compute.manager [req-50bde0b3-d16b-40e2-a0c2-bf941664d992 req-d8b97ed7-2f15-429f-91c7-b658456ede8b service nova] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] No waiting events found dispatching network-vif-plugged-5e633cb4-e056-4752-9478-8e180c9c6869 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:19:38 user nova-compute[71605]: WARNING nova.compute.manager [req-50bde0b3-d16b-40e2-a0c2-bf941664d992 req-d8b97ed7-2f15-429f-91c7-b658456ede8b service nova] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Received unexpected event network-vif-plugged-5e633cb4-e056-4752-9478-8e180c9c6869 for instance with vm_state building and task_state spawning. Apr 20 16:19:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:40 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:19:40 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] VM Resumed (Lifecycle Event) Apr 20 16:19:40 user nova-compute[71605]: DEBUG nova.compute.manager [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:19:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:19:40 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Instance spawned successfully. 
Apr 20 16:19:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:19:40 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:19:40 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:19:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:19:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Found default for hw_disk_bus of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:19:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:19:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:19:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:19:40 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None 
req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Found default for hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:19:40 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:19:40 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:19:40 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] VM Started (Lifecycle Event) Apr 20 16:19:40 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:19:40 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:19:40 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:19:40 user nova-compute[71605]: INFO nova.compute.manager [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Took 5.45 seconds to spawn the instance on the hypervisor. Apr 20 16:19:40 user nova-compute[71605]: DEBUG nova.compute.manager [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:19:40 user nova-compute[71605]: INFO nova.compute.manager [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Took 5.99 seconds to build instance. 
Apr 20 16:19:40 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-cd3b40b9-a1ae-406e-b8de-c528b415cdd3 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "972fbea6-71af-4e33-9f9d-d82c46fcd564" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.093s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:19:41 user nova-compute[71605]: DEBUG nova.compute.manager [req-6a68abe9-d6ed-4e16-8854-9eb608cc7d3c req-7e484e40-993b-43dd-95e5-2004dbce9e6a service nova] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Received event network-vif-plugged-5e633cb4-e056-4752-9478-8e180c9c6869 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:19:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-6a68abe9-d6ed-4e16-8854-9eb608cc7d3c req-7e484e40-993b-43dd-95e5-2004dbce9e6a service nova] Acquiring lock "972fbea6-71af-4e33-9f9d-d82c46fcd564-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:19:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-6a68abe9-d6ed-4e16-8854-9eb608cc7d3c req-7e484e40-993b-43dd-95e5-2004dbce9e6a service nova] Lock "972fbea6-71af-4e33-9f9d-d82c46fcd564-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:19:41 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-6a68abe9-d6ed-4e16-8854-9eb608cc7d3c req-7e484e40-993b-43dd-95e5-2004dbce9e6a service nova] Lock "972fbea6-71af-4e33-9f9d-d82c46fcd564-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:19:41 user nova-compute[71605]: DEBUG nova.compute.manager [req-6a68abe9-d6ed-4e16-8854-9eb608cc7d3c req-7e484e40-993b-43dd-95e5-2004dbce9e6a service nova] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] No waiting events found dispatching network-vif-plugged-5e633cb4-e056-4752-9478-8e180c9c6869 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:19:41 user nova-compute[71605]: WARNING nova.compute.manager [req-6a68abe9-d6ed-4e16-8854-9eb608cc7d3c req-7e484e40-993b-43dd-95e5-2004dbce9e6a service nova] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Received unexpected event network-vif-plugged-5e633cb4-e056-4752-9478-8e180c9c6869 for instance with vm_state active and task_state None. 
Apr 20 16:19:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:19:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:19:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:19:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:19:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:49 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:50 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:19:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:19:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:19:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:19:50 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:19:50 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:19:50 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:19:50 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:19:50 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:19:50 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:19:50 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:19:50 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:19:50 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:19:51 user 
nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:19:51 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:19:51 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=8959MB free_disk=26.310203552246094GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:19:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:19:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:19:51 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance cabd55bf-46c4-41be-942d-b6563f6b2778 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:19:51 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 972fbea6-71af-4e33-9f9d-d82c46fcd564 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:19:51 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:19:51 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:19:51 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Refreshing inventories for resource provider 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 20 16:19:51 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Updating ProviderTree inventory for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 20 16:19:51 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Updating inventory in ProviderTree for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 20 16:19:51 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Refreshing aggregate associations for resource provider 00e9f769-1a1c-4f1e-80e4-b19657803102, aggregates: None {{(pid=71605) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 20 16:19:51 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Refreshing trait associations for resource provider 00e9f769-1a1c-4f1e-80e4-b19657803102, traits: 
COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_MMX,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI {{(pid=71605) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 20 16:19:51 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:19:51 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:19:51 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:19:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.448s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:19:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:53 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:19:53 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task 
ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:19:54 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:19:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:19:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:19:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:19:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:19:54 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:19:54 user nova-compute[71605]: DEBUG nova.objects.instance [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lazy-loading 'info_cache' on Instance uuid cabd55bf-46c4-41be-942d-b6563f6b2778 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:19:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:54 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Updating instance_info_cache with network_info: [{"id": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "address": "fa:16:3e:5f:14:92", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a9dc4c-8c", "ovs_interfaceid": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, 
"meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:19:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:19:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:19:54 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:19:54 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:19:55 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:19:56 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:19:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:19:58 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:19:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:20:00 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:20:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:20:04 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:20:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:20:09 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:20:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:20:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:20:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:20:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:20:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:20:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:20:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:20:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:20:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:20:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:20:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:20:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:20:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:20:24 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:20:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:20:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:20:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:20:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run 
/usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:20:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:20:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:20:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:20:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:20:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:20:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:20:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:20:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:20:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:20:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:20:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:20:50 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:20:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:20:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:20:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:20:50 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:20:50 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:20:50 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:20:50 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:20:50 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:20:50 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:20:50 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:20:50 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:20:50 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:20:51 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:20:51 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:20:51 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=9040MB free_disk=26.290019989013672GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", 
"product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:20:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:20:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:20:51 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance cabd55bf-46c4-41be-942d-b6563f6b2778 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:20:51 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 972fbea6-71af-4e33-9f9d-d82c46fcd564 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:20:51 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:20:51 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:20:51 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:20:51 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:20:51 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:20:51 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.198s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:20:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:20:54 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:20:54 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:20:55 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:20:55 user nova-compute[71605]: DEBUG nova.compute.manager [None 
req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:20:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-972fbea6-71af-4e33-9f9d-d82c46fcd564" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:20:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-972fbea6-71af-4e33-9f9d-d82c46fcd564" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:20:55 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:20:55 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Updating instance_info_cache with network_info: [{"id": "5e633cb4-e056-4752-9478-8e180c9c6869", "address": "fa:16:3e:9c:d7:a9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e633cb4-e0", "ovs_interfaceid": "5e633cb4-e056-4752-9478-8e180c9c6869", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:20:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock "refresh_cache-972fbea6-71af-4e33-9f9d-d82c46fcd564" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:20:55 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:20:55 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:20:56 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:20:56 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:20:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:20:57 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:20:58 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:20:58 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:20:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:21:01 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:21:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:21:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:21:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:21:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:21:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:21:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:21:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:21:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:21:16 user nova-compute[71605]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:21:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:21:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:21:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:21:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:21:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:21:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:21:25 user nova-compute[71605]: DEBUG nova.compute.manager [None req-90016ebd-397c-4a07-bbef-204473bdf394 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:21:25 user nova-compute[71605]: INFO nova.compute.manager [None req-90016ebd-397c-4a07-bbef-204473bdf394 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] instance snapshotting Apr 20 16:21:25 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-90016ebd-397c-4a07-bbef-204473bdf394 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Beginning live snapshot process Apr 20 16:21:25 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-90016ebd-397c-4a07-bbef-204473bdf394 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json -f qcow2 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:21:25 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-90016ebd-397c-4a07-bbef-204473bdf394 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json -f qcow2" returned: 0 in 0.137s {{(pid=71605) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:21:25 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-90016ebd-397c-4a07-bbef-204473bdf394 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json -f qcow2 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:21:25 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-90016ebd-397c-4a07-bbef-204473bdf394 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json -f qcow2" returned: 0 in 0.132s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:21:25 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-90016ebd-397c-4a07-bbef-204473bdf394 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:21:26 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-90016ebd-397c-4a07-bbef-204473bdf394 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890 --force-share --output=json" returned: 0 in 0.131s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:21:26 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-90016ebd-397c-4a07-bbef-204473bdf394 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmp7is_kv2r/2ca0dfbab07a4f97a205477997440669.delta 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:21:26 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-90016ebd-397c-4a07-bbef-204473bdf394 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890,backing_fmt=raw 
/opt/stack/data/nova/instances/snapshots/tmp7is_kv2r/2ca0dfbab07a4f97a205477997440669.delta 1073741824" returned: 0 in 0.048s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:21:26 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-90016ebd-397c-4a07-bbef-204473bdf394 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Quiescing instance not available: QEMU guest agent is not enabled. Apr 20 16:21:26 user nova-compute[71605]: DEBUG nova.virt.libvirt.guest [None req-90016ebd-397c-4a07-bbef-204473bdf394 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=71605) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 20 16:21:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:21:27 user nova-compute[71605]: DEBUG nova.virt.libvirt.guest [None req-90016ebd-397c-4a07-bbef-204473bdf394 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=71605) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 20 16:21:27 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-90016ebd-397c-4a07-bbef-204473bdf394 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Skipping quiescing instance: QEMU guest agent is not enabled. 
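The "COPY block job progress, current cursor ... final cursor ..." entries above come from nova polling the libvirt block job that copies the running disk into the snapshot overlay. A minimal sketch of such a polling loop is below; it is not nova's own guest.py code, it assumes libvirt-python is available, a qemu:///system connection, and that the copied device is named 'vda' (the device name is not shown in the log).

```python
# Hedged sketch: poll a libvirt block job until its cursor reaches the end,
# mirroring the "current cursor / final cursor" progress lines in the log.
# Assumptions (not from the log): libvirt-python, qemu:///system, device 'vda'.
import time
import libvirt

def wait_for_block_job(dom, device="vda", interval=0.5):
    """Return once the block job on `device` reports cur == end (or no job)."""
    while True:
        info = dom.blockJobInfo(device, 0)   # {} when no job is active
        if not info:
            return
        cur, end = info.get("cur", 0), info.get("end", 0)
        print(f"COPY block job progress, current cursor: {cur} final cursor: {end}")
        if end > 0 and cur >= end:
            return
        time.sleep(interval)

conn = libvirt.open("qemu:///system")
dom = conn.lookupByUUIDString("972fbea6-71af-4e33-9f9d-d82c46fcd564")
wait_for_block_job(dom)
```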
Apr 20 16:21:27 user nova-compute[71605]: DEBUG nova.privsep.utils [None req-90016ebd-397c-4a07-bbef-204473bdf394 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71605) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 20 16:21:27 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-90016ebd-397c-4a07-bbef-204473bdf394 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmp7is_kv2r/2ca0dfbab07a4f97a205477997440669.delta /opt/stack/data/nova/instances/snapshots/tmp7is_kv2r/2ca0dfbab07a4f97a205477997440669 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:21:27 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-90016ebd-397c-4a07-bbef-204473bdf394 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmp7is_kv2r/2ca0dfbab07a4f97a205477997440669.delta /opt/stack/data/nova/instances/snapshots/tmp7is_kv2r/2ca0dfbab07a4f97a205477997440669" returned: 0 in 0.316s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:21:27 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-90016ebd-397c-4a07-bbef-204473bdf394 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Snapshot extracted, beginning image upload Apr 20 16:21:29 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-90016ebd-397c-4a07-bbef-204473bdf394 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Snapshot image upload complete Apr 20 16:21:29 user nova-compute[71605]: INFO nova.compute.manager [None req-90016ebd-397c-4a07-bbef-204473bdf394 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Took 4.28 seconds to snapshot the instance on the hypervisor. 
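The live snapshot sequence that just completed consists of two qemu-img steps visible in the log: create a thin qcow2 overlay whose backing file is the cached raw base image, then, after libvirt has copied the active disk into that overlay, flatten the delta into a standalone qcow2 for upload. The sketch below re-runs exactly those two commands from the log entries above via subprocess; the wrapper itself is illustrative, not nova code.

```python
# Hedged sketch of the two qemu-img invocations shown in the snapshot log.
# Paths and flags are taken verbatim from the log; only the Python wrapper
# around them is assumed.
import subprocess

BASE = "/opt/stack/data/nova/instances/_base/4030659dc9e6940e4f224066d06e3784b1229890"
DELTA = "/opt/stack/data/nova/instances/snapshots/tmp7is_kv2r/2ca0dfbab07a4f97a205477997440669.delta"
OUT = DELTA.removesuffix(".delta")

# 1. Thin qcow2 overlay backed by the raw base image (size as in the log).
subprocess.run(
    ["qemu-img", "create", "-f", "qcow2",
     "-o", f"backing_file={BASE},backing_fmt=raw",
     DELTA, "1073741824"],
    check=True,
)

# ... libvirt copies the running disk into the overlay (block job above) ...

# 2. Flatten the delta into a standalone qcow2 ready for image upload.
subprocess.run(
    ["qemu-img", "convert", "-t", "none", "-O", "qcow2", "-f", "qcow2", DELTA, OUT],
    check=True,
)
```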
Apr 20 16:21:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:21:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:21:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:21:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:21:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:21:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:21:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:21:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:21:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:21:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:21:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:21:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:21:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:21:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:21:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:21:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:21:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:21:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 
16:21:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:21:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:21:52 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:21:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:21:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:21:52 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:21:52 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:21:52 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:21:52 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:21:52 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:21:52 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None 
req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:21:52 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:21:52 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:21:52 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:21:52 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:21:53 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:21:53 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
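The resource audit above sizes each instance disk with "qemu-img info --force-share --output=json", wrapped in oslo_concurrency.prlimit to cap address space and CPU time. A hedged sketch of reading the same data directly, with the prlimit wrapper omitted (the JSON field names are standard qemu-img output):

    import json
    import subprocess

    # Disk path copied from the logged command.
    disk = "/opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk"

    out = subprocess.run(
        ["qemu-img", "info", disk, "--force-share", "--output=json"],
        capture_output=True, text=True, check=True,
    ).stdout
    info = json.loads(out)

    # "virtual-size" and "actual-size" are reported by qemu-img in bytes.
    print(info["format"], info["virtual-size"], info.get("actual-size"))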
Apr 20 16:21:53 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=9057MB free_disk=26.253276824951172GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:21:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:21:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:21:53 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance cabd55bf-46c4-41be-942d-b6563f6b2778 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:21:53 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 972fbea6-71af-4e33-9f9d-d82c46fcd564 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:21:53 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:21:53 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:21:53 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:21:53 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:21:53 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:21:53 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.213s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:21:54 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:21:54 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:21:56 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:21:56 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:21:56 user 
nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:21:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:21:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:21:56 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:21:56 user nova-compute[71605]: DEBUG nova.objects.instance [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lazy-loading 'info_cache' on Instance uuid cabd55bf-46c4-41be-942d-b6563f6b2778 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:21:56 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Updating instance_info_cache with network_info: [{"id": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "address": "fa:16:3e:5f:14:92", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a9dc4c-8c", "ovs_interfaceid": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:21:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:21:56 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:21:56 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances 
{{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:21:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:21:57 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:21:57 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:21:58 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:22:00 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:22:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:22:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:22:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:22:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:22:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:22:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:22:03 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:22:04 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:22:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:22:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:22:16 user 
nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:22:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:22:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:22:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:22:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:22:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:22:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:22:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:22:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:22:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "b8d7db8a-bebc-4924-8151-722ff9eb177a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:22:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "b8d7db8a-bebc-4924-8151-722ff9eb177a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:22:32 user nova-compute[71605]: DEBUG nova.compute.manager [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Starting instance... 
{{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:22:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:22:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:22:32 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:22:32 user nova-compute[71605]: INFO nova.compute.claims [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Claim successful on node user Apr 20 16:22:32 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:22:32 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:22:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.236s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:22:32 user nova-compute[71605]: DEBUG nova.compute.manager [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 
tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Start building networks asynchronously for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:22:32 user nova-compute[71605]: DEBUG nova.compute.manager [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Allocating IP information in the background. {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:22:32 user nova-compute[71605]: DEBUG nova.network.neutron [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:22:32 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 16:22:32 user nova-compute[71605]: DEBUG nova.compute.manager [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:22:32 user nova-compute[71605]: INFO nova.virt.block_device [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Booting with volume-backed-image 4ac69ea5-e5d7-40c8-864e-0a164d78a727 at /dev/vda Apr 20 16:22:32 user nova-compute[71605]: DEBUG nova.policy [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8c79a05e12ae4aab91bc79d32b02ef46', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8f978ad5201e412894f30daa8e2bd2e8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:22:33 user nova-compute[71605]: WARNING nova.compute.manager [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Volume id: c41d850a-2fa4-4555-b101-a5fd4f7755cf finished being created but its status is error. 
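The warning above marks the end of Nova's wait for the Cinder volume: creation finished, but the volume landed in "error" rather than "available", so the wait gives up on the first check and the traceback below follows. An illustrative poll loop under that pattern (an assumption, not Nova's code; the hypothetical get_status callable stands in for a Cinder "show volume" call, and the retry count and interval are illustrative defaults rather than Nova's configured values):

    import time

    def wait_for_volume(get_status, retries=60, interval=3):
        """Poll a volume-status callable until the volume becomes available."""
        for attempt in range(1, retries + 1):
            status = get_status()
            if status == "available":
                return attempt
            if status == "error":
                # Matches the case logged above: creation finished in 'error',
                # so there is no point in waiting further.
                raise RuntimeError("volume entered error state")
            time.sleep(interval)
        raise RuntimeError("volume did not become available in time")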
Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Instance failed block device setup: nova.exception.VolumeNotCreated: Volume c41d850a-2fa4-4555-b101-a5fd4f7755cf did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Traceback (most recent call last): Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] File "/opt/stack/nova/nova/compute/manager.py", line 2175, in _prep_block_device Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] driver_block_device.attach_block_devices( Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] File "/opt/stack/nova/nova/virt/block_device.py", line 936, in attach_block_devices Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] _log_and_attach(device) Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] File "/opt/stack/nova/nova/virt/block_device.py", line 933, in _log_and_attach Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] bdm.attach(*attach_args, **attach_kwargs) Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] File "/opt/stack/nova/nova/virt/block_device.py", line 831, in attach Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] self.volume_id, self.attachment_id = self._create_volume( Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] File "/opt/stack/nova/nova/virt/block_device.py", line 435, in _create_volume Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] self._call_wait_func(context, wait_func, volume_api, vol['id']) Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] File "/opt/stack/nova/nova/virt/block_device.py", line 785, in _call_wait_func Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] with excutils.save_and_reraise_exception(): Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] self.force_reraise() Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] raise self.value Apr 20 16:22:33 user 
nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] File "/opt/stack/nova/nova/virt/block_device.py", line 783, in _call_wait_func Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] wait_func(context, volume_id) Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] File "/opt/stack/nova/nova/compute/manager.py", line 1792, in _await_block_device_map_created Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] raise exception.VolumeNotCreated(volume_id=vol_id, Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] nova.exception.VolumeNotCreated: Volume c41d850a-2fa4-4555-b101-a5fd4f7755cf did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. Apr 20 16:22:33 user nova-compute[71605]: ERROR nova.compute.manager [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Apr 20 16:22:33 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:22:33 user nova-compute[71605]: DEBUG nova.network.neutron [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Successfully created port: 2ff3bfa7-7af2-4b5c-bce9-8fafca0ef930 {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.network.neutron [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Successfully updated port: 2ff3bfa7-7af2-4b5c-bce9-8fafca0ef930 {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "refresh_cache-b8d7db8a-bebc-4924-8151-722ff9eb177a" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquired lock "refresh_cache-b8d7db8a-bebc-4924-8151-722ff9eb177a" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.network.neutron [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.network.neutron [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 
tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Instance cache missing network info. {{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.compute.manager [req-5b20e150-c5a7-4ae0-bf1c-e1ddad398313 req-c3f88d4e-8117-4fa5-8f16-96ead2549faf service nova] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Received event network-changed-2ff3bfa7-7af2-4b5c-bce9-8fafca0ef930 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.compute.manager [req-5b20e150-c5a7-4ae0-bf1c-e1ddad398313 req-c3f88d4e-8117-4fa5-8f16-96ead2549faf service nova] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Refreshing instance network info cache due to event network-changed-2ff3bfa7-7af2-4b5c-bce9-8fafca0ef930. {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-5b20e150-c5a7-4ae0-bf1c-e1ddad398313 req-c3f88d4e-8117-4fa5-8f16-96ead2549faf service nova] Acquiring lock "refresh_cache-b8d7db8a-bebc-4924-8151-722ff9eb177a" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.network.neutron [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Updating instance_info_cache with network_info: [{"id": "2ff3bfa7-7af2-4b5c-bce9-8fafca0ef930", "address": "fa:16:3e:d6:62:e9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ff3bfa7-7a", "ovs_interfaceid": "2ff3bfa7-7af2-4b5c-bce9-8fafca0ef930", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Releasing lock "refresh_cache-b8d7db8a-bebc-4924-8151-722ff9eb177a" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.compute.manager [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 
b8d7db8a-bebc-4924-8151-722ff9eb177a] Instance network_info: |[{"id": "2ff3bfa7-7af2-4b5c-bce9-8fafca0ef930", "address": "fa:16:3e:d6:62:e9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ff3bfa7-7a", "ovs_interfaceid": "2ff3bfa7-7af2-4b5c-bce9-8fafca0ef930", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-5b20e150-c5a7-4ae0-bf1c-e1ddad398313 req-c3f88d4e-8117-4fa5-8f16-96ead2549faf service nova] Acquired lock "refresh_cache-b8d7db8a-bebc-4924-8151-722ff9eb177a" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.network.neutron [req-5b20e150-c5a7-4ae0-bf1c-e1ddad398313 req-c3f88d4e-8117-4fa5-8f16-96ead2549faf service nova] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Refreshing network info cache for port 2ff3bfa7-7af2-4b5c-bce9-8fafca0ef930 {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.compute.claims [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Aborting claim: {{(pid=71605) abort /opt/stack/nova/nova/compute/claims.py:84}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory 
/opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.223s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.compute.manager [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Build of instance b8d7db8a-bebc-4924-8151-722ff9eb177a aborted: Volume c41d850a-2fa4-4555-b101-a5fd4f7755cf did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. {{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2636}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.compute.utils [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Build of instance b8d7db8a-bebc-4924-8151-722ff9eb177a aborted: Volume c41d850a-2fa4-4555-b101-a5fd4f7755cf did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. {{(pid=71605) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} Apr 20 16:22:34 user nova-compute[71605]: ERROR nova.compute.manager [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Build of instance b8d7db8a-bebc-4924-8151-722ff9eb177a aborted: Volume c41d850a-2fa4-4555-b101-a5fd4f7755cf did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error.: nova.exception.BuildAbortException: Build of instance b8d7db8a-bebc-4924-8151-722ff9eb177a aborted: Volume c41d850a-2fa4-4555-b101-a5fd4f7755cf did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. 
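For reference, the inventory re-logged above for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 implies the following schedulable capacity under the usual placement rule (total - reserved) * allocation_ratio; a small worked sketch with the values copied from the log:

    # Values copied from the logged inventory data.
    inventory = {
        "VCPU": {"total": 12, "reserved": 0, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 16023, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 40, "reserved": 0, "allocation_ratio": 1.0},
    }
    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(rc, capacity)  # VCPU 48.0, MEMORY_MB 15511.0, DISK_GB 40.0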
Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.compute.manager [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Unplugging VIFs for instance {{(pid=71605) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:22:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-301528259',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-301528259',id=23,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8f978ad5201e412894f30daa8e2bd2e8',ramdisk_id='',reservation_id='r-fe7bo8j5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2108053043',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member'},tags=TagList,task_state='block_device_mapping',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:22:33Z,user_data=None,user_id='8c79a05e12ae4aab91bc79d32b02ef46',uuid=b8d7db8a-bebc-4924-8151-722ff9eb177a,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2ff3bfa7-7af2-4b5c-bce9-8fafca0ef930", "address": "fa:16:3e:d6:62:e9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap2ff3bfa7-7a", "ovs_interfaceid": "2ff3bfa7-7af2-4b5c-bce9-8fafca0ef930", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Converting VIF {"id": "2ff3bfa7-7af2-4b5c-bce9-8fafca0ef930", "address": "fa:16:3e:d6:62:e9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ff3bfa7-7a", "ovs_interfaceid": "2ff3bfa7-7af2-4b5c-bce9-8fafca0ef930", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:62:e9,bridge_name='br-int',has_traffic_filtering=True,id=2ff3bfa7-7af2-4b5c-bce9-8fafca0ef930,network=Network(110d8e20-360f-48b7-8b42-9ae9760d39b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ff3bfa7-7a') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG os_vif [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:62:e9,bridge_name='br-int',has_traffic_filtering=True,id=2ff3bfa7-7af2-4b5c-bce9-8fafca0ef930,network=Network(110d8e20-360f-48b7-8b42-9ae9760d39b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ff3bfa7-7a') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2ff3bfa7-7a, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no 
change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:22:34 user nova-compute[71605]: INFO os_vif [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:62:e9,bridge_name='br-int',has_traffic_filtering=True,id=2ff3bfa7-7af2-4b5c-bce9-8fafca0ef930,network=Network(110d8e20-360f-48b7-8b42-9ae9760d39b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ff3bfa7-7a') Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.compute.manager [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Unplugged VIFs for instance {{(pid=71605) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.compute.manager [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.network.neutron [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.network.neutron [req-5b20e150-c5a7-4ae0-bf1c-e1ddad398313 req-c3f88d4e-8117-4fa5-8f16-96ead2549faf service nova] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Updated VIF entry in instance network info cache for port 2ff3bfa7-7af2-4b5c-bce9-8fafca0ef930. 
{{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG nova.network.neutron [req-5b20e150-c5a7-4ae0-bf1c-e1ddad398313 req-c3f88d4e-8117-4fa5-8f16-96ead2549faf service nova] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Updating instance_info_cache with network_info: [{"id": "2ff3bfa7-7af2-4b5c-bce9-8fafca0ef930", "address": "fa:16:3e:d6:62:e9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ff3bfa7-7a", "ovs_interfaceid": "2ff3bfa7-7af2-4b5c-bce9-8fafca0ef930", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:22:34 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-5b20e150-c5a7-4ae0-bf1c-e1ddad398313 req-c3f88d4e-8117-4fa5-8f16-96ead2549faf service nova] Releasing lock "refresh_cache-b8d7db8a-bebc-4924-8151-722ff9eb177a" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:22:35 user nova-compute[71605]: DEBUG nova.network.neutron [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:22:35 user nova-compute[71605]: INFO nova.compute.manager [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: b8d7db8a-bebc-4924-8151-722ff9eb177a] Took 0.87 seconds to deallocate network for instance. 
Apr 20 16:22:35 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Deleted allocations for instance b8d7db8a-bebc-4924-8151-722ff9eb177a Apr 20 16:22:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-6ef37d6c-8975-4862-a108-ee81ddfce279 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "b8d7db8a-bebc-4924-8151-722ff9eb177a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 3.667s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:22:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:22:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:22:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:22:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:22:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:22:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:22:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:22:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:22:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:22:54 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:22:54 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:22:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:22:54 
user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:22:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:22:54 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:22:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:22:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:22:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:22:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:22:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:22:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:22:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:22:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:22:55 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:22:55 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:22:55 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=9041MB free_disk=26.252262115478516GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", 
"vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:22:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:22:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:22:55 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance cabd55bf-46c4-41be-942d-b6563f6b2778 actively managed on this compute host and has 
allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:22:55 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 972fbea6-71af-4e33-9f9d-d82c46fcd564 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:22:55 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:22:55 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:22:55 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:22:55 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:22:55 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:22:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.261s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:22:56 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:22:56 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:22:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None 
req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-972fbea6-71af-4e33-9f9d-d82c46fcd564" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:22:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-972fbea6-71af-4e33-9f9d-d82c46fcd564" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:22:56 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:22:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:22:57 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Updating instance_info_cache with network_info: [{"id": "5e633cb4-e056-4752-9478-8e180c9c6869", "address": "fa:16:3e:9c:d7:a9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e633cb4-e0", "ovs_interfaceid": "5e633cb4-e056-4752-9478-8e180c9c6869", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:22:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock "refresh_cache-972fbea6-71af-4e33-9f9d-d82c46fcd564" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:22:57 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:22:57 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:22:57 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 
16:22:59 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:22:59 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:23:00 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:23:00 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:23:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:23:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:23:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:23:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:23:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:23:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:23:02 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:23:04 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:23:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:23:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:23:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:23:11 user 
nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:23:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:23:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:23:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:23:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:23:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:23:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:23:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:23:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:23:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:23:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:23:22 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "972fbea6-71af-4e33-9f9d-d82c46fcd564" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:23:22 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "972fbea6-71af-4e33-9f9d-d82c46fcd564" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:23:22 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "972fbea6-71af-4e33-9f9d-d82c46fcd564-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" 
{{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:23:22 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "972fbea6-71af-4e33-9f9d-d82c46fcd564-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:23:22 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "972fbea6-71af-4e33-9f9d-d82c46fcd564-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:23:22 user nova-compute[71605]: INFO nova.compute.manager [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Terminating instance Apr 20 16:23:22 user nova-compute[71605]: DEBUG nova.compute.manager [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Start destroying the instance on the hypervisor. {{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:23:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:23:22 user nova-compute[71605]: DEBUG nova.compute.manager [req-1ae5ead7-112a-4a24-bcc3-e7de5df532cc req-8aff5709-3599-49c1-b6ad-bdf7e1322b00 service nova] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Received event network-vif-unplugged-5e633cb4-e056-4752-9478-8e180c9c6869 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:23:22 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-1ae5ead7-112a-4a24-bcc3-e7de5df532cc req-8aff5709-3599-49c1-b6ad-bdf7e1322b00 service nova] Acquiring lock "972fbea6-71af-4e33-9f9d-d82c46fcd564-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:23:22 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-1ae5ead7-112a-4a24-bcc3-e7de5df532cc req-8aff5709-3599-49c1-b6ad-bdf7e1322b00 service nova] Lock "972fbea6-71af-4e33-9f9d-d82c46fcd564-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:23:22 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-1ae5ead7-112a-4a24-bcc3-e7de5df532cc req-8aff5709-3599-49c1-b6ad-bdf7e1322b00 service nova] Lock "972fbea6-71af-4e33-9f9d-d82c46fcd564-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s 
{{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:23:22 user nova-compute[71605]: DEBUG nova.compute.manager [req-1ae5ead7-112a-4a24-bcc3-e7de5df532cc req-8aff5709-3599-49c1-b6ad-bdf7e1322b00 service nova] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] No waiting events found dispatching network-vif-unplugged-5e633cb4-e056-4752-9478-8e180c9c6869 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:23:22 user nova-compute[71605]: DEBUG nova.compute.manager [req-1ae5ead7-112a-4a24-bcc3-e7de5df532cc req-8aff5709-3599-49c1-b6ad-bdf7e1322b00 service nova] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Received event network-vif-unplugged-5e633cb4-e056-4752-9478-8e180c9c6869 for instance with task_state deleting. {{(pid=71605) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 16:23:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:23:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:23:23 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Instance destroyed successfully. Apr 20 16:23:23 user nova-compute[71605]: DEBUG nova.objects.instance [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lazy-loading 'resources' on Instance uuid 972fbea6-71af-4e33-9f9d-d82c46fcd564 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:23:23 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:19:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1879699440',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1879699440',id=22,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-20T16:19:40Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='8f978ad5201e412894f30daa8e2bd2e8',ramdisk_id='',reservation_id='r-neso7an2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2108053043',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:21:30Z,user_data=None,user_id='8c79a05e12ae4aab91bc79d32b02ef46',uuid=972fbea6-71af-4e33-9f9d-d82c46fcd564,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5e633cb4-e056-4752-9478-8e180c9c6869", "address": "fa:16:3e:9c:d7:a9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e633cb4-e0", "ovs_interfaceid": "5e633cb4-e056-4752-9478-8e180c9c6869", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:23:23 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Converting 
VIF {"id": "5e633cb4-e056-4752-9478-8e180c9c6869", "address": "fa:16:3e:9c:d7:a9", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e633cb4-e0", "ovs_interfaceid": "5e633cb4-e056-4752-9478-8e180c9c6869", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:23:23 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9c:d7:a9,bridge_name='br-int',has_traffic_filtering=True,id=5e633cb4-e056-4752-9478-8e180c9c6869,network=Network(110d8e20-360f-48b7-8b42-9ae9760d39b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e633cb4-e0') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:23:23 user nova-compute[71605]: DEBUG os_vif [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9c:d7:a9,bridge_name='br-int',has_traffic_filtering=True,id=5e633cb4-e056-4752-9478-8e180c9c6869,network=Network(110d8e20-360f-48b7-8b42-9ae9760d39b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e633cb4-e0') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:23:23 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:23:23 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e633cb4-e0, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:23:23 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:23:23 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:23:23 user nova-compute[71605]: INFO os_vif [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Successfully unplugged vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:9c:d7:a9,bridge_name='br-int',has_traffic_filtering=True,id=5e633cb4-e056-4752-9478-8e180c9c6869,network=Network(110d8e20-360f-48b7-8b42-9ae9760d39b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e633cb4-e0') Apr 20 16:23:23 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Deleting instance files /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564_del Apr 20 16:23:23 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Deletion of /opt/stack/data/nova/instances/972fbea6-71af-4e33-9f9d-d82c46fcd564_del complete Apr 20 16:23:23 user nova-compute[71605]: INFO nova.compute.manager [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Took 0.65 seconds to destroy the instance on the hypervisor. Apr 20 16:23:23 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:23:23 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:23:23 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:23:23 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:23:23 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Took 0.70 seconds to deallocate network for instance. 
Apr 20 16:23:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:23:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:23:24 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:23:24 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:23:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.139s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:23:24 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Deleted allocations for instance 972fbea6-71af-4e33-9f9d-d82c46fcd564 Apr 20 16:23:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-06da7dda-4424-4842-84d7-1fef3835b790 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "972fbea6-71af-4e33-9f9d-d82c46fcd564" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.659s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:23:24 user nova-compute[71605]: DEBUG nova.compute.manager [req-0edeaf07-d640-488b-9981-811fcfdfd4c3 
req-1fa9c4f8-5032-4e7b-aaea-38617a755a81 service nova] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Received event network-vif-plugged-5e633cb4-e056-4752-9478-8e180c9c6869 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:23:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-0edeaf07-d640-488b-9981-811fcfdfd4c3 req-1fa9c4f8-5032-4e7b-aaea-38617a755a81 service nova] Acquiring lock "972fbea6-71af-4e33-9f9d-d82c46fcd564-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:23:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-0edeaf07-d640-488b-9981-811fcfdfd4c3 req-1fa9c4f8-5032-4e7b-aaea-38617a755a81 service nova] Lock "972fbea6-71af-4e33-9f9d-d82c46fcd564-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:23:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-0edeaf07-d640-488b-9981-811fcfdfd4c3 req-1fa9c4f8-5032-4e7b-aaea-38617a755a81 service nova] Lock "972fbea6-71af-4e33-9f9d-d82c46fcd564-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:23:24 user nova-compute[71605]: DEBUG nova.compute.manager [req-0edeaf07-d640-488b-9981-811fcfdfd4c3 req-1fa9c4f8-5032-4e7b-aaea-38617a755a81 service nova] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] No waiting events found dispatching network-vif-plugged-5e633cb4-e056-4752-9478-8e180c9c6869 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:23:24 user nova-compute[71605]: WARNING nova.compute.manager [req-0edeaf07-d640-488b-9981-811fcfdfd4c3 req-1fa9c4f8-5032-4e7b-aaea-38617a755a81 service nova] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Received unexpected event network-vif-plugged-5e633cb4-e056-4752-9478-8e180c9c6869 for instance with vm_state deleted and task_state None. 
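The external events Neutron sends to Nova are named "<event type>-<port UUID>", e.g. network-vif-plugged-5e633cb4-e056-4752-9478-8e180c9c6869; when one arrives after the instance is already gone (vm_state deleted), Nova logs the WARNING above and discards it. A small, hypothetical helper (not part of Nova) for splitting such names while reading logs like this:

    # Hypothetical helper that splits a Neutron external event name into its
    # event type and port UUID. The UUID is always the last 36 characters,
    # preceded by a hyphen.
    def split_external_event(name: str) -> tuple[str, str]:
        event, port_id = name[:-37], name[-36:]
        return event, port_id


    event, port = split_external_event(
        "network-vif-plugged-5e633cb4-e056-4752-9478-8e180c9c6869")
    assert event == "network-vif-plugged"
    assert port == "5e633cb4-e056-4752-9478-8e180c9c6869"
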
Apr 20 16:23:24 user nova-compute[71605]: DEBUG nova.compute.manager [req-0edeaf07-d640-488b-9981-811fcfdfd4c3 req-1fa9c4f8-5032-4e7b-aaea-38617a755a81 service nova] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Received event network-vif-deleted-5e633cb4-e056-4752-9478-8e180c9c6869 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:23:28 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:23:29 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:23:33 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:23:38 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:23:38 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] VM Stopped (Lifecycle Event) Apr 20 16:23:38 user nova-compute[71605]: DEBUG nova.compute.manager [None req-a0a272fa-c153-4c55-9b01-9d574e25d84d None None] [instance: 972fbea6-71af-4e33-9f9d-d82c46fcd564] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:23:38 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:23:43 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:23:48 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:23:53 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:23:54 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:23:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:23:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:23:54 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71605) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:23:54 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:23:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:23:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:23:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:23:54 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:23:55 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:23:55 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
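The resource audit above shells out to qemu-img info with --force-share --output=json, wrapped in oslo_concurrency.prlimit to cap address space and CPU time. The same information can be read by hand; a minimal sketch, assuming the disk path from the log still exists on this host:

    # Sketch: run the same qemu-img query the resource tracker issues and
    # read the JSON result. The disk path is copied from the log and is an
    # assumption about what exists on this particular host.
    import json
    import subprocess

    disk = "/opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778/disk"
    out = subprocess.run(
        ["qemu-img", "info", disk, "--force-share", "--output=json"],
        check=True, capture_output=True, text=True).stdout
    info = json.loads(out)
    print(info["format"], info["virtual-size"], info.get("actual-size"))
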
Apr 20 16:23:55 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=9103MB free_disk=26.30703353881836GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:23:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:23:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:23:55 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance cabd55bf-46c4-41be-942d-b6563f6b2778 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:23:55 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:23:55 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:23:55 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:23:55 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:23:55 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:23:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.189s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:23:56 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:23:56 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:23:56 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:23:56 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:23:56 user nova-compute[71605]: DEBUG 
oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:23:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:23:56 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:23:56 user nova-compute[71605]: DEBUG nova.objects.instance [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lazy-loading 'info_cache' on Instance uuid cabd55bf-46c4-41be-942d-b6563f6b2778 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:23:56 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Updating instance_info_cache with network_info: [{"id": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "address": "fa:16:3e:5f:14:92", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a9dc4c-8c", "ovs_interfaceid": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:23:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock "refresh_cache-cabd55bf-46c4-41be-942d-b6563f6b2778" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:23:56 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:23:56 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:23:57 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task 
ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:23:58 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:24:00 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:24:00 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:24:00 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:24:01 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:24:02 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:24:03 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:24:03 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:24:03 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Cleaning up deleted instances {{(pid=71605) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 20 16:24:03 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] There are 0 instances to clean {{(pid=71605) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 20 16:24:06 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:24:08 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:24:11 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71605) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:24:11 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Cleaning up deleted instances with incomplete migration {{(pid=71605) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 20 16:24:13 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:24:13 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:24:13 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:24:13 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:24:13 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:24:13 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:24:13 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "cabd55bf-46c4-41be-942d-b6563f6b2778" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:24:13 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "cabd55bf-46c4-41be-942d-b6563f6b2778" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:24:13 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "cabd55bf-46c4-41be-942d-b6563f6b2778-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:24:13 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "cabd55bf-46c4-41be-942d-b6563f6b2778-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:24:13 user 
nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "cabd55bf-46c4-41be-942d-b6563f6b2778-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:24:13 user nova-compute[71605]: INFO nova.compute.manager [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Terminating instance Apr 20 16:24:13 user nova-compute[71605]: DEBUG nova.compute.manager [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Start destroying the instance on the hypervisor. {{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:24:13 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:24:13 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:24:13 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:24:13 user nova-compute[71605]: DEBUG nova.compute.manager [req-ce5ec0db-6422-4fe5-aec9-b450c5558266 req-6426e617-aa35-4850-8db8-1de8fc99b2b3 service nova] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Received event network-vif-unplugged-51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:24:13 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ce5ec0db-6422-4fe5-aec9-b450c5558266 req-6426e617-aa35-4850-8db8-1de8fc99b2b3 service nova] Acquiring lock "cabd55bf-46c4-41be-942d-b6563f6b2778-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:24:13 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ce5ec0db-6422-4fe5-aec9-b450c5558266 req-6426e617-aa35-4850-8db8-1de8fc99b2b3 service nova] Lock "cabd55bf-46c4-41be-942d-b6563f6b2778-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:24:13 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ce5ec0db-6422-4fe5-aec9-b450c5558266 req-6426e617-aa35-4850-8db8-1de8fc99b2b3 service nova] Lock "cabd55bf-46c4-41be-942d-b6563f6b2778-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:24:13 user nova-compute[71605]: DEBUG nova.compute.manager [req-ce5ec0db-6422-4fe5-aec9-b450c5558266 req-6426e617-aa35-4850-8db8-1de8fc99b2b3 service 
nova] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] No waiting events found dispatching network-vif-unplugged-51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:24:13 user nova-compute[71605]: DEBUG nova.compute.manager [req-ce5ec0db-6422-4fe5-aec9-b450c5558266 req-6426e617-aa35-4850-8db8-1de8fc99b2b3 service nova] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Received event network-vif-unplugged-51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff for instance with task_state deleting. {{(pid=71605) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 16:24:13 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:24:14 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Instance destroyed successfully. Apr 20 16:24:14 user nova-compute[71605]: DEBUG nova.objects.instance [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lazy-loading 'resources' on Instance uuid cabd55bf-46c4-41be-942d-b6563f6b2778 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:24:14 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:15:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1055967206',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1055967206',id=20,image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-20T16:15:51Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='8f978ad5201e412894f30daa8e2bd2e8',ramdisk_id='',reservation_id='r-0vah7prs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='4ac69ea5-e5d7-40c8-864e-0a164d78a727',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-2108053043',owner_user_name='tempest-ServerBo
otFromVolumeStableRescueTest-2108053043-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:17:41Z,user_data=None,user_id='8c79a05e12ae4aab91bc79d32b02ef46',uuid=cabd55bf-46c4-41be-942d-b6563f6b2778,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "address": "fa:16:3e:5f:14:92", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a9dc4c-8c", "ovs_interfaceid": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:24:14 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Converting VIF {"id": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "address": "fa:16:3e:5f:14:92", "network": {"id": "110d8e20-360f-48b7-8b42-9ae9760d39b8", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-1089439276-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "8f978ad5201e412894f30daa8e2bd2e8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap51a9dc4c-8c", "ovs_interfaceid": "51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:24:14 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5f:14:92,bridge_name='br-int',has_traffic_filtering=True,id=51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff,network=Network(110d8e20-360f-48b7-8b42-9ae9760d39b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51a9dc4c-8c') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:24:14 user nova-compute[71605]: DEBUG os_vif [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 
tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:14:92,bridge_name='br-int',has_traffic_filtering=True,id=51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff,network=Network(110d8e20-360f-48b7-8b42-9ae9760d39b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51a9dc4c-8c') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:24:14 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:24:14 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap51a9dc4c-8c, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:24:14 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:24:14 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:24:14 user nova-compute[71605]: INFO os_vif [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:14:92,bridge_name='br-int',has_traffic_filtering=True,id=51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff,network=Network(110d8e20-360f-48b7-8b42-9ae9760d39b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51a9dc4c-8c') Apr 20 16:24:14 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Deleting instance files /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778_del Apr 20 16:24:14 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Deletion of /opt/stack/data/nova/instances/cabd55bf-46c4-41be-942d-b6563f6b2778_del complete Apr 20 16:24:14 user nova-compute[71605]: INFO nova.compute.manager [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Took 0.86 seconds to destroy the instance on the hypervisor. Apr 20 16:24:14 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:24:14 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:24:14 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:24:14 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:24:14 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Took 0.48 seconds to deallocate network for instance. Apr 20 16:24:14 user nova-compute[71605]: DEBUG nova.compute.manager [req-94692dde-f7d6-4f8f-8654-a9a1f6ef00f9 req-b7349462-fa73-4480-944b-4da4bc696f3b service nova] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Received event network-vif-deleted-51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:24:14 user nova-compute[71605]: INFO nova.compute.manager [req-94692dde-f7d6-4f8f-8654-a9a1f6ef00f9 req-b7349462-fa73-4480-944b-4da4bc696f3b service nova] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Neutron deleted interface 51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff; detaching it from the instance and deleting it from the info cache Apr 20 16:24:14 user nova-compute[71605]: DEBUG nova.network.neutron [req-94692dde-f7d6-4f8f-8654-a9a1f6ef00f9 req-b7349462-fa73-4480-944b-4da4bc696f3b service nova] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:24:14 user nova-compute[71605]: DEBUG nova.compute.manager [req-94692dde-f7d6-4f8f-8654-a9a1f6ef00f9 req-b7349462-fa73-4480-944b-4da4bc696f3b service nova] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Detach interface failed, port_id=51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff, reason: Instance cabd55bf-46c4-41be-942d-b6563f6b2778 could not be found. 
{{(pid=71605) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 20 16:24:14 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:24:14 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:24:14 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:24:14 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:24:14 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.110s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:24:14 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Deleted allocations for instance cabd55bf-46c4-41be-942d-b6563f6b2778 Apr 20 16:24:14 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-44fd8f1c-8595-4b83-b310-a38174712c58 tempest-ServerBootFromVolumeStableRescueTest-2108053043 tempest-ServerBootFromVolumeStableRescueTest-2108053043-project-member] Lock "cabd55bf-46c4-41be-942d-b6563f6b2778" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.622s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:24:15 user nova-compute[71605]: DEBUG 
nova.compute.manager [req-ac9e2c7e-2e13-4481-809f-966aefe0b118 req-e1a37c6d-3b2b-4d26-9c41-fbcf47f24eb5 service nova] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Received event network-vif-plugged-51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:24:15 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ac9e2c7e-2e13-4481-809f-966aefe0b118 req-e1a37c6d-3b2b-4d26-9c41-fbcf47f24eb5 service nova] Acquiring lock "cabd55bf-46c4-41be-942d-b6563f6b2778-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:24:15 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ac9e2c7e-2e13-4481-809f-966aefe0b118 req-e1a37c6d-3b2b-4d26-9c41-fbcf47f24eb5 service nova] Lock "cabd55bf-46c4-41be-942d-b6563f6b2778-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:24:15 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ac9e2c7e-2e13-4481-809f-966aefe0b118 req-e1a37c6d-3b2b-4d26-9c41-fbcf47f24eb5 service nova] Lock "cabd55bf-46c4-41be-942d-b6563f6b2778-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:24:15 user nova-compute[71605]: DEBUG nova.compute.manager [req-ac9e2c7e-2e13-4481-809f-966aefe0b118 req-e1a37c6d-3b2b-4d26-9c41-fbcf47f24eb5 service nova] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] No waiting events found dispatching network-vif-plugged-51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:24:15 user nova-compute[71605]: WARNING nova.compute.manager [req-ac9e2c7e-2e13-4481-809f-966aefe0b118 req-e1a37c6d-3b2b-4d26-9c41-fbcf47f24eb5 service nova] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Received unexpected event network-vif-plugged-51a9dc4c-8c53-4b2d-b20e-aca8c3c70bff for instance with vm_state deleted and task_state None. 
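The inventory blobs reported to Placement throughout this log follow the usual capacity rule: the schedulable amount of a resource class is (total - reserved) * allocation_ratio. A quick check against the values logged for provider 00e9f769-1a1c-4f1e-80e4-b19657803102:

    # Sketch: compute schedulable capacity from the inventory reported in
    # the log above (only the fields needed for the formula are kept).
    inventory = {
        "VCPU": {"total": 12, "reserved": 0, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 16023, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 40, "reserved": 0, "allocation_ratio": 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: {capacity:g} schedulable")
        # VCPU: 48, MEMORY_MB: 15511, DISK_GB: 40
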
Apr 20 16:24:19 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:24:24 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:24:29 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:24:29 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] VM Stopped (Lifecycle Event) Apr 20 16:24:29 user nova-compute[71605]: DEBUG nova.compute.manager [None req-15e85064-b281-49f5-b596-b0292edb539f None None] [instance: cabd55bf-46c4-41be-942d-b6563f6b2778] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:24:29 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:24:29 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:24:29 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:24:29 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:24:29 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:24:29 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:24:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:24:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:24:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:24:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:24:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:24:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:24:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:24:44 user 
nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:24:49 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:24:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:24:55 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:24:55 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:24:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:24:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:24:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:24:55 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:24:55 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:24:55 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 16:24:55 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=9206MB free_disk=26.361949920654297GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:24:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:24:55 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:24:55 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:24:55 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:24:55 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Refreshing inventories for resource provider 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 20 16:24:55 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Updating ProviderTree inventory for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 20 16:24:55 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Updating inventory in ProviderTree for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 20 16:24:55 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Refreshing aggregate associations for resource provider 00e9f769-1a1c-4f1e-80e4-b19657803102, aggregates: None {{(pid=71605) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 20 16:24:55 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Refreshing trait associations for resource provider 00e9f769-1a1c-4f1e-80e4-b19657803102, traits: COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_MMX,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI {{(pid=71605) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 20 16:24:55 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:24:55 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:24:56 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:24:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None 
req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.306s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:24:59 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:24:59 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:24:59 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:24:59 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Didn't find any instances for network info cache update. {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 20 16:24:59 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:24:59 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:24:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:24:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:24:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:24:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:24:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:24:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:01 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:25:01 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None 
req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:25:01 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:25:02 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:25:02 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:25:04 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:05 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:05 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:06 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:25:09 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:14 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:25:19 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:25:24 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:25:29 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:25:29 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:25:29 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:25:29 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:25:29 user nova-compute[71605]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:25:29 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:30 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:33 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:40 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:43 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquiring lock "b3efdbb9-b302-44db-bba1-32181ad4e70d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:25:43 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "b3efdbb9-b302-44db-bba1-32181ad4e70d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:25:43 user nova-compute[71605]: DEBUG nova.compute.manager [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Starting instance... 
{{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:25:43 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:25:43 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:25:43 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:25:43 user nova-compute[71605]: INFO nova.compute.claims [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Claim successful on node user Apr 20 16:25:43 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:25:43 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:25:43 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.235s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:25:43 user nova-compute[71605]: DEBUG nova.compute.manager [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Start building networks asynchronously for 
instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:25:44 user nova-compute[71605]: DEBUG nova.compute.manager [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Allocating IP information in the background. {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:25:44 user nova-compute[71605]: DEBUG nova.network.neutron [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:25:44 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 20 16:25:44 user nova-compute[71605]: DEBUG nova.compute.manager [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:25:44 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:44 user nova-compute[71605]: DEBUG nova.policy [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f4ec42233fc040d0bef4f2e408a561b7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1558b18ec4304868ad6d8c61b5525d55', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:25:44 user nova-compute[71605]: DEBUG nova.compute.manager [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Start spawning the instance on the hypervisor. 
{{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:25:44 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:25:44 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Creating image(s) Apr 20 16:25:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquiring lock "/opt/stack/data/nova/instances/b3efdbb9-b302-44db-bba1-32181ad4e70d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:25:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "/opt/stack/data/nova/instances/b3efdbb9-b302-44db-bba1-32181ad4e70d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:25:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "/opt/stack/data/nova/instances/b3efdbb9-b302-44db-bba1-32181ad4e70d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:25:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquiring lock "885fb30ae2285948f2d5070f26422e3b387e9c1f" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:25:44 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "885fb30ae2285948f2d5070f26422e3b387e9c1f" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.003s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:25:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C 
LANG=C qemu-img info /opt/stack/data/nova/instances/_base/885fb30ae2285948f2d5070f26422e3b387e9c1f.part --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:25:44 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/885fb30ae2285948f2d5070f26422e3b387e9c1f.part --force-share --output=json" returned: 0 in 0.137s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:25:44 user nova-compute[71605]: DEBUG nova.virt.images [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] 831fa3c6-3f0e-4cb3-88a3-dda89b7ed63f was qcow2, converting to raw {{(pid=71605) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 20 16:25:44 user nova-compute[71605]: DEBUG nova.privsep.utils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71605) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 20 16:25:44 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/885fb30ae2285948f2d5070f26422e3b387e9c1f.part /opt/stack/data/nova/instances/_base/885fb30ae2285948f2d5070f26422e3b387e9c1f.converted {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:25:44 user nova-compute[71605]: DEBUG nova.network.neutron [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Successfully created port: ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729 {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/885fb30ae2285948f2d5070f26422e3b387e9c1f.part /opt/stack/data/nova/instances/_base/885fb30ae2285948f2d5070f26422e3b387e9c1f.converted" returned: 0 in 0.111s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/885fb30ae2285948f2d5070f26422e3b387e9c1f.converted --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/885fb30ae2285948f2d5070f26422e3b387e9c1f.converted --force-share --output=json" returned: 0 in 0.138s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "885fb30ae2285948f2d5070f26422e3b387e9c1f" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.956s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/885fb30ae2285948f2d5070f26422e3b387e9c1f --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/885fb30ae2285948f2d5070f26422e3b387e9c1f --force-share --output=json" returned: 0 in 0.127s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquiring lock "885fb30ae2285948f2d5070f26422e3b387e9c1f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "885fb30ae2285948f2d5070f26422e3b387e9c1f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea 
tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/885fb30ae2285948f2d5070f26422e3b387e9c1f --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/885fb30ae2285948f2d5070f26422e3b387e9c1f --force-share --output=json" returned: 0 in 0.144s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/885fb30ae2285948f2d5070f26422e3b387e9c1f,backing_fmt=raw /opt/stack/data/nova/instances/b3efdbb9-b302-44db-bba1-32181ad4e70d/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/885fb30ae2285948f2d5070f26422e3b387e9c1f,backing_fmt=raw /opt/stack/data/nova/instances/b3efdbb9-b302-44db-bba1-32181ad4e70d/disk 1073741824" returned: 0 in 0.044s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "885fb30ae2285948f2d5070f26422e3b387e9c1f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.195s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/885fb30ae2285948f2d5070f26422e3b387e9c1f --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 
--cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/885fb30ae2285948f2d5070f26422e3b387e9c1f --force-share --output=json" returned: 0 in 0.126s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Checking if we can resize image /opt/stack/data/nova/instances/b3efdbb9-b302-44db-bba1-32181ad4e70d/disk. size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b3efdbb9-b302-44db-bba1-32181ad4e70d/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG nova.network.neutron [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Successfully updated port: ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729 {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG nova.compute.manager [req-ccbac26c-e0f1-45c4-9bf6-3322ab1b8343 req-7f47381a-aa55-4e79-9a6d-2a471c92cd80 service nova] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Received event network-changed-ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG nova.compute.manager [req-ccbac26c-e0f1-45c4-9bf6-3322ab1b8343 req-7f47381a-aa55-4e79-9a6d-2a471c92cd80 service nova] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Refreshing instance network info cache due to event network-changed-ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729. 
{{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ccbac26c-e0f1-45c4-9bf6-3322ab1b8343 req-7f47381a-aa55-4e79-9a6d-2a471c92cd80 service nova] Acquiring lock "refresh_cache-b3efdbb9-b302-44db-bba1-32181ad4e70d" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ccbac26c-e0f1-45c4-9bf6-3322ab1b8343 req-7f47381a-aa55-4e79-9a6d-2a471c92cd80 service nova] Acquired lock "refresh_cache-b3efdbb9-b302-44db-bba1-32181ad4e70d" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG nova.network.neutron [req-ccbac26c-e0f1-45c4-9bf6-3322ab1b8343 req-7f47381a-aa55-4e79-9a6d-2a471c92cd80 service nova] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Refreshing network info cache for port ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729 {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquiring lock "refresh_cache-b3efdbb9-b302-44db-bba1-32181ad4e70d" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG nova.network.neutron [req-ccbac26c-e0f1-45c4-9bf6-3322ab1b8343 req-7f47381a-aa55-4e79-9a6d-2a471c92cd80 service nova] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Instance cache missing network info. {{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b3efdbb9-b302-44db-bba1-32181ad4e70d/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Cannot resize image /opt/stack/data/nova/instances/b3efdbb9-b302-44db-bba1-32181ad4e70d/disk to a smaller size. 
{{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG nova.objects.instance [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lazy-loading 'migration_context' on Instance uuid b3efdbb9-b302-44db-bba1-32181ad4e70d {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Ensure instance console log exists: /opt/stack/data/nova/instances/b3efdbb9-b302-44db-bba1-32181ad4e70d/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG nova.network.neutron [req-ccbac26c-e0f1-45c4-9bf6-3322ab1b8343 req-7f47381a-aa55-4e79-9a6d-2a471c92cd80 service nova] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ccbac26c-e0f1-45c4-9bf6-3322ab1b8343 req-7f47381a-aa55-4e79-9a6d-2a471c92cd80 service nova] Releasing lock "refresh_cache-b3efdbb9-b302-44db-bba1-32181ad4e70d" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquired lock 
"refresh_cache-b3efdbb9-b302-44db-bba1-32181ad4e70d" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG nova.network.neutron [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:25:45 user nova-compute[71605]: DEBUG nova.network.neutron [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Instance cache missing network info. {{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.network.neutron [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Updating instance_info_cache with network_info: [{"id": "ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "address": "fa:16:3e:e7:ae:35", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapccb48d0c-5b", "ovs_interfaceid": "ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Releasing lock "refresh_cache-b3efdbb9-b302-44db-bba1-32181ad4e70d" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.compute.manager [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Instance network_info: |[{"id": "ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "address": "fa:16:3e:e7:ae:35", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapccb48d0c-5b", "ovs_interfaceid": "ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Start _get_guest_xml network_info=[{"id": "ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "address": "fa:16:3e:e7:ae:35", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapccb48d0c-5b", "ovs_interfaceid": "ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T16:25:41Z,direct_url=,disk_format='qcow2',id=831fa3c6-3f0e-4cb3-88a3-dda89b7ed63f,min_disk=0,min_ram=0,name='tempest-scenario-img--2004798390',owner='1558b18ec4304868ad6d8c61b5525d55',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T16:25:42Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': '831fa3c6-3f0e-4cb3-88a3-dda89b7ed63f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:25:46 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 20 16:25:46 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T16:25:41Z,direct_url=,disk_format='qcow2',id=831fa3c6-3f0e-4cb3-88a3-dda89b7ed63f,min_disk=0,min_ram=0,name='tempest-scenario-img--2004798390',owner='1558b18ec4304868ad6d8c61b5525d55',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T16:25:42Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:25:46 user nova-compute[71605]: 
DEBUG nova.virt.hardware [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:25:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1941398318',display_name='tempest-TestMinimumBasicScenario-server-1941398318',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1941398318',id=24,image_ref='831fa3c6-3f0e-4cb3-88a3-dda89b7ed63f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLG80KFXisUW1wgnTLUsZOsmfwb0L2g+7zVWJHjlgBov1H5d5fA/SYjooY3obdw4LI3DgJUhFqtlNsqpLQfIe2UgxJvi7QEmnu2VryHUuLpN882Z/fgkNNzTXR8n7LE8gw==',key_name='tempest-TestMinimumBasicScenario-880855576',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1558b18ec4304868ad6d8c61b5525d55',ramdisk_id='',reservation_id='r-jxjfc5zu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='831fa3c6-3f0e-4cb3-88a3-dda89b7ed63f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-1763718283',owner_user_name='tempest-TestMinimumBasicScenario-1763718283-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:25:44Z,user_data=None,user_id='f4ec42233fc040d0bef4f2e408a561b7',uuid=b3efdbb9-b302-44db-bba1-32181ad4e70d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "address": "fa:16:3e:e7:ae:35", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapccb48d0c-5b", "ovs_interfaceid": "ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Converting VIF {"id": "ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "address": "fa:16:3e:e7:ae:35", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapccb48d0c-5b", "ovs_interfaceid": 
"ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:ae:35,bridge_name='br-int',has_traffic_filtering=True,id=ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729,network=Network(b19664d3-6727-47cf-81c3-ee6ea3992cb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccb48d0c-5b') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.objects.instance [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lazy-loading 'pci_devices' on Instance uuid b3efdbb9-b302-44db-bba1-32181ad4e70d {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] End _get_guest_xml xml= Apr 20 16:25:46 user nova-compute[71605]: b3efdbb9-b302-44db-bba1-32181ad4e70d Apr 20 16:25:46 user nova-compute[71605]: instance-00000018 Apr 20 16:25:46 user nova-compute[71605]: 131072 Apr 20 16:25:46 user nova-compute[71605]: 1 Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: tempest-TestMinimumBasicScenario-server-1941398318 Apr 20 16:25:46 user nova-compute[71605]: 2023-04-20 16:25:46 Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: 128 Apr 20 16:25:46 user nova-compute[71605]: 1 Apr 20 16:25:46 user nova-compute[71605]: 0 Apr 20 16:25:46 user nova-compute[71605]: 0 Apr 20 16:25:46 user nova-compute[71605]: 1 Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: tempest-TestMinimumBasicScenario-1763718283-project-member Apr 20 16:25:46 user nova-compute[71605]: tempest-TestMinimumBasicScenario-1763718283 Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: OpenStack Foundation Apr 20 16:25:46 user nova-compute[71605]: OpenStack Nova Apr 20 16:25:46 user nova-compute[71605]: 0.0.0 Apr 20 16:25:46 user nova-compute[71605]: b3efdbb9-b302-44db-bba1-32181ad4e70d Apr 20 16:25:46 user nova-compute[71605]: b3efdbb9-b302-44db-bba1-32181ad4e70d Apr 20 16:25:46 user nova-compute[71605]: Virtual Machine Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user 
nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: hvm Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Nehalem Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: /dev/urandom Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: Apr 20 16:25:46 user nova-compute[71605]: {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:25:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1941398318',display_name='tempest-TestMinimumBasicScenario-server-1941398318',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1941398318',id=24,image_ref='831fa3c6-3f0e-4cb3-88a3-dda89b7ed63f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLG80KFXisUW1wgnTLUsZOsmfwb0L2g+7zVWJHjlgBov1H5d5fA/SYjooY3obdw4LI3DgJUhFqtlNsqpLQfIe2UgxJvi7QEmnu2VryHUuLpN882Z/fgkNNzTXR8n7LE8gw==',key_name='tempest-TestMinimumBasicScenario-880855576',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1558b18ec4304868ad6d8c61b5525d55',ramdisk_id='',reservation_id='r-jxjfc5zu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='831fa3c6-3f0e-4cb3-88a3-dda89b7ed63f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-1763718283',owner_user_name='tempest-TestMinimumBasicScenario-1763718283-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:25:44Z,user_data=None,user_id='f4ec42233fc040d0bef4f2e408a561b7',uuid=b3efdbb9-b302-44db-bba1-32181ad4e70d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "address": "fa:16:3e:e7:ae:35", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapccb48d0c-5b", "ovs_interfaceid": "ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Converting VIF {"id": "ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "address": "fa:16:3e:e7:ae:35", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapccb48d0c-5b", "ovs_interfaceid": 
"ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e7:ae:35,bridge_name='br-int',has_traffic_filtering=True,id=ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729,network=Network(b19664d3-6727-47cf-81c3-ee6ea3992cb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccb48d0c-5b') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG os_vif [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:ae:35,bridge_name='br-int',has_traffic_filtering=True,id=ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729,network=Network(b19664d3-6727-47cf-81c3-ee6ea3992cb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccb48d0c-5b') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapccb48d0c-5b, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapccb48d0c-5b, col_values=(('external_ids', {'iface-id': 'ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e7:ae:35', 'vm-uuid': 'b3efdbb9-b302-44db-bba1-32181ad4e70d'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:46 user nova-compute[71605]: INFO os_vif [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e7:ae:35,bridge_name='br-int',has_traffic_filtering=True,id=ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729,network=Network(b19664d3-6727-47cf-81c3-ee6ea3992cb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccb48d0c-5b') Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] No BDM found with device name vda, not building metadata. {{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:25:46 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] No VIF found with MAC fa:16:3e:e7:ae:35, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:25:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:48 user nova-compute[71605]: DEBUG nova.compute.manager [req-87f0602e-4131-4c50-b264-ff38cc496a96 req-6c284bc4-04a2-4640-816b-e2bc0b2a9207 service nova] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Received event network-vif-plugged-ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:25:48 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-87f0602e-4131-4c50-b264-ff38cc496a96 req-6c284bc4-04a2-4640-816b-e2bc0b2a9207 service nova] Acquiring lock "b3efdbb9-b302-44db-bba1-32181ad4e70d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:25:48 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-87f0602e-4131-4c50-b264-ff38cc496a96 req-6c284bc4-04a2-4640-816b-e2bc0b2a9207 service nova] Lock "b3efdbb9-b302-44db-bba1-32181ad4e70d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:25:48 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-87f0602e-4131-4c50-b264-ff38cc496a96 req-6c284bc4-04a2-4640-816b-e2bc0b2a9207 service nova] Lock "b3efdbb9-b302-44db-bba1-32181ad4e70d-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:25:48 user nova-compute[71605]: DEBUG nova.compute.manager [req-87f0602e-4131-4c50-b264-ff38cc496a96 req-6c284bc4-04a2-4640-816b-e2bc0b2a9207 service nova] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] No waiting events found dispatching network-vif-plugged-ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:25:48 user nova-compute[71605]: WARNING nova.compute.manager [req-87f0602e-4131-4c50-b264-ff38cc496a96 req-6c284bc4-04a2-4640-816b-e2bc0b2a9207 service nova] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Received unexpected event network-vif-plugged-ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729 for instance with vm_state building and task_state spawning. Apr 20 16:25:48 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:48 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:48 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:49 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:49 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:25:49 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] VM Resumed (Lifecycle Event) Apr 20 16:25:49 user nova-compute[71605]: DEBUG nova.compute.manager [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:25:49 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:25:49 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Instance spawned successfully. 
Apr 20 16:25:49 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:25:49 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:25:49 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:25:49 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:25:49 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Found default for hw_disk_bus of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:25:49 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:25:49 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:25:49 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:25:49 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 
b3efdbb9-b302-44db-bba1-32181ad4e70d] Found default for hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:25:49 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:25:49 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:25:49 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] VM Started (Lifecycle Event) Apr 20 16:25:49 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:25:49 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:25:49 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:25:49 user nova-compute[71605]: INFO nova.compute.manager [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Took 5.75 seconds to spawn the instance on the hypervisor. Apr 20 16:25:49 user nova-compute[71605]: DEBUG nova.compute.manager [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:25:50 user nova-compute[71605]: INFO nova.compute.manager [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Took 6.28 seconds to build instance. 
Apr 20 16:25:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0c868f4b-07d0-41fe-9378-2e5b4bb92eea tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "b3efdbb9-b302-44db-bba1-32181ad4e70d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.385s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:25:50 user nova-compute[71605]: DEBUG nova.compute.manager [req-ceaaad21-4cfe-41ae-9050-cbf028867e6e req-c04b86ed-5030-4bd9-bfcd-47af749c1753 service nova] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Received event network-vif-plugged-ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:25:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ceaaad21-4cfe-41ae-9050-cbf028867e6e req-c04b86ed-5030-4bd9-bfcd-47af749c1753 service nova] Acquiring lock "b3efdbb9-b302-44db-bba1-32181ad4e70d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:25:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ceaaad21-4cfe-41ae-9050-cbf028867e6e req-c04b86ed-5030-4bd9-bfcd-47af749c1753 service nova] Lock "b3efdbb9-b302-44db-bba1-32181ad4e70d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:25:50 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ceaaad21-4cfe-41ae-9050-cbf028867e6e req-c04b86ed-5030-4bd9-bfcd-47af749c1753 service nova] Lock "b3efdbb9-b302-44db-bba1-32181ad4e70d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:25:50 user nova-compute[71605]: DEBUG nova.compute.manager [req-ceaaad21-4cfe-41ae-9050-cbf028867e6e req-c04b86ed-5030-4bd9-bfcd-47af749c1753 service nova] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] No waiting events found dispatching network-vif-plugged-ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:25:50 user nova-compute[71605]: WARNING nova.compute.manager [req-ceaaad21-4cfe-41ae-9050-cbf028867e6e req-c04b86ed-5030-4bd9-bfcd-47af749c1753 service nova] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Received unexpected event network-vif-plugged-ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729 for instance with vm_state active and task_state None. 
Apr 20 16:25:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:55 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:25:56 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:25:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:25:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:25:56 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:25:56 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:25:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:25:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b3efdbb9-b302-44db-bba1-32181ad4e70d/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:25:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b3efdbb9-b302-44db-bba1-32181ad4e70d/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:25:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None 
req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b3efdbb9-b302-44db-bba1-32181ad4e70d/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:25:56 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b3efdbb9-b302-44db-bba1-32181ad4e70d/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:25:57 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:25:57 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:25:57 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=9050MB free_disk=26.303680419921875GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": 
"0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:25:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:25:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:25:57 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance b3efdbb9-b302-44db-bba1-32181ad4e70d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:25:57 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:25:57 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:25:57 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:25:57 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:25:57 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:25:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.200s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:25:59 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:25:59 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:26:00 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:26:00 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:26:00 
user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:26:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-b3efdbb9-b302-44db-bba1-32181ad4e70d" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:26:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-b3efdbb9-b302-44db-bba1-32181ad4e70d" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:26:00 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:26:00 user nova-compute[71605]: DEBUG nova.objects.instance [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lazy-loading 'info_cache' on Instance uuid b3efdbb9-b302-44db-bba1-32181ad4e70d {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:26:00 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Updating instance_info_cache with network_info: [{"id": "ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "address": "fa:16:3e:e7:ae:35", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapccb48d0c-5b", "ovs_interfaceid": "ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:26:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock "refresh_cache-b3efdbb9-b302-44db-bba1-32181ad4e70d" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:26:00 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:26:01 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes 
{{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:26:01 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:26:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:26:03 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:26:04 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:26:04 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:26:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:26:07 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:26:09 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:26:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:26:14 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:26:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:26:19 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:26:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:26:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:26:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:26:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run 
/usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:26:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:26:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:26:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:26:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:26:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:26:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:26:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:26:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:26:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:26:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:26:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:26:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:26:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:26:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:26:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:26:57 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:26:57 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:26:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:26:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:26:57 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:26:57 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:26:57 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b3efdbb9-b302-44db-bba1-32181ad4e70d/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:26:57 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b3efdbb9-b302-44db-bba1-32181ad4e70d/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:26:57 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b3efdbb9-b302-44db-bba1-32181ad4e70d/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:26:57 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b3efdbb9-b302-44db-bba1-32181ad4e70d/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:26:58 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 20 16:26:58 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:26:58 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=9071MB free_disk=26.283382415771484GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", 
"product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:26:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:26:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:26:58 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance b3efdbb9-b302-44db-bba1-32181ad4e70d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:26:58 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:26:58 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:26:58 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:26:58 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:26:58 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:26:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.155s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:27:01 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:27:01 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:27:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:27:02 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:27:02 user nova-compute[71605]: DEBUG nova.compute.manager [None 
req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:27:02 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:27:02 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-b3efdbb9-b302-44db-bba1-32181ad4e70d" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:27:02 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-b3efdbb9-b302-44db-bba1-32181ad4e70d" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:27:02 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:27:02 user nova-compute[71605]: DEBUG nova.objects.instance [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lazy-loading 'info_cache' on Instance uuid b3efdbb9-b302-44db-bba1-32181ad4e70d {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:27:02 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Updating instance_info_cache with network_info: [{"id": "ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "address": "fa:16:3e:e7:ae:35", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapccb48d0c-5b", "ovs_interfaceid": "ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:27:02 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock "refresh_cache-b3efdbb9-b302-44db-bba1-32181ad4e70d" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:27:02 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:27:02 
user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:27:02 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:27:03 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:27:03 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:27:06 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:27:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:27:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:27:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:27:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:27:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:27:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:27:07 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:27:09 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:27:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:27:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:27:21 user 
nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:27:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:27:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquiring lock "b3efdbb9-b302-44db-bba1-32181ad4e70d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "b3efdbb9-b302-44db-bba1-32181ad4e70d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquiring lock "b3efdbb9-b302-44db-bba1-32181ad4e70d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "b3efdbb9-b302-44db-bba1-32181ad4e70d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "b3efdbb9-b302-44db-bba1-32181ad4e70d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:27:36 user nova-compute[71605]: INFO nova.compute.manager [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Terminating instance Apr 20 16:27:36 user nova-compute[71605]: DEBUG nova.compute.manager [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Start destroying the instance on the hypervisor. 
{{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG nova.compute.manager [req-13b18a07-1109-451b-8c20-536203fbefa7 req-d8c5ab18-f055-42a8-a7f5-d47cdbb3a5d1 service nova] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Received event network-vif-unplugged-ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-13b18a07-1109-451b-8c20-536203fbefa7 req-d8c5ab18-f055-42a8-a7f5-d47cdbb3a5d1 service nova] Acquiring lock "b3efdbb9-b302-44db-bba1-32181ad4e70d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-13b18a07-1109-451b-8c20-536203fbefa7 req-d8c5ab18-f055-42a8-a7f5-d47cdbb3a5d1 service nova] Lock "b3efdbb9-b302-44db-bba1-32181ad4e70d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-13b18a07-1109-451b-8c20-536203fbefa7 req-d8c5ab18-f055-42a8-a7f5-d47cdbb3a5d1 service nova] Lock "b3efdbb9-b302-44db-bba1-32181ad4e70d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG nova.compute.manager [req-13b18a07-1109-451b-8c20-536203fbefa7 req-d8c5ab18-f055-42a8-a7f5-d47cdbb3a5d1 service nova] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] No waiting events found dispatching network-vif-unplugged-ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG nova.compute.manager [req-13b18a07-1109-451b-8c20-536203fbefa7 req-d8c5ab18-f055-42a8-a7f5-d47cdbb3a5d1 service nova] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Received event network-vif-unplugged-ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729 for instance with task_state deleting. {{(pid=71605) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 16:27:36 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Instance destroyed successfully. 
Apr 20 16:27:36 user nova-compute[71605]: DEBUG nova.objects.instance [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lazy-loading 'resources' on Instance uuid b3efdbb9-b302-44db-bba1-32181ad4e70d {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:25:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1941398318',display_name='tempest-TestMinimumBasicScenario-server-1941398318',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1941398318',id=24,image_ref='831fa3c6-3f0e-4cb3-88a3-dda89b7ed63f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLG80KFXisUW1wgnTLUsZOsmfwb0L2g+7zVWJHjlgBov1H5d5fA/SYjooY3obdw4LI3DgJUhFqtlNsqpLQfIe2UgxJvi7QEmnu2VryHUuLpN882Z/fgkNNzTXR8n7LE8gw==',key_name='tempest-TestMinimumBasicScenario-880855576',keypairs=,launch_index=0,launched_at=2023-04-20T16:25:49Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='1558b18ec4304868ad6d8c61b5525d55',ramdisk_id='',reservation_id='r-jxjfc5zu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='831fa3c6-3f0e-4cb3-88a3-dda89b7ed63f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-1763718283',owner_user_name='tempest-TestMinimumBasicScenario-1763718283-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:25:50Z,user_data=None,user_id='f4ec42233fc040d0bef4f2e408a561b7',uuid=b3efdbb9-b302-44db-bba1-32181ad4e70d,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "address": "fa:16:3e:e7:ae:35", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", 
"details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapccb48d0c-5b", "ovs_interfaceid": "ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Converting VIF {"id": "ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "address": "fa:16:3e:e7:ae:35", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapccb48d0c-5b", "ovs_interfaceid": "ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e7:ae:35,bridge_name='br-int',has_traffic_filtering=True,id=ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729,network=Network(b19664d3-6727-47cf-81c3-ee6ea3992cb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccb48d0c-5b') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG os_vif [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e7:ae:35,bridge_name='br-int',has_traffic_filtering=True,id=ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729,network=Network(b19664d3-6727-47cf-81c3-ee6ea3992cb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccb48d0c-5b') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapccb48d0c-5b, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout 
{{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:27:36 user nova-compute[71605]: INFO os_vif [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e7:ae:35,bridge_name='br-int',has_traffic_filtering=True,id=ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729,network=Network(b19664d3-6727-47cf-81c3-ee6ea3992cb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccb48d0c-5b') Apr 20 16:27:36 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Deleting instance files /opt/stack/data/nova/instances/b3efdbb9-b302-44db-bba1-32181ad4e70d_del Apr 20 16:27:36 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Deletion of /opt/stack/data/nova/instances/b3efdbb9-b302-44db-bba1-32181ad4e70d_del complete Apr 20 16:27:36 user nova-compute[71605]: INFO nova.compute.manager [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Took 0.67 seconds to destroy the instance on the hypervisor. Apr 20 16:27:36 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:27:36 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:27:37 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:27:37 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Took 0.47 seconds to deallocate network for instance. 
Apr 20 16:27:37 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:27:37 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:27:37 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:27:37 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:27:37 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.119s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:27:37 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Deleted allocations for instance b3efdbb9-b302-44db-bba1-32181ad4e70d Apr 20 16:27:37 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-0490de03-4392-4e4f-b580-c5e68c6230ae tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "b3efdbb9-b302-44db-bba1-32181ad4e70d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.452s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:27:38 user nova-compute[71605]: DEBUG nova.compute.manager [req-8b1b3f68-edaf-4d39-8423-b1263b5709a2 req-0304a67a-91e0-4fc7-be7a-99562813eb13 service nova] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Received event network-vif-plugged-ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729 {{(pid=71605) 
external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:27:38 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8b1b3f68-edaf-4d39-8423-b1263b5709a2 req-0304a67a-91e0-4fc7-be7a-99562813eb13 service nova] Acquiring lock "b3efdbb9-b302-44db-bba1-32181ad4e70d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:27:38 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8b1b3f68-edaf-4d39-8423-b1263b5709a2 req-0304a67a-91e0-4fc7-be7a-99562813eb13 service nova] Lock "b3efdbb9-b302-44db-bba1-32181ad4e70d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:27:38 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-8b1b3f68-edaf-4d39-8423-b1263b5709a2 req-0304a67a-91e0-4fc7-be7a-99562813eb13 service nova] Lock "b3efdbb9-b302-44db-bba1-32181ad4e70d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:27:38 user nova-compute[71605]: DEBUG nova.compute.manager [req-8b1b3f68-edaf-4d39-8423-b1263b5709a2 req-0304a67a-91e0-4fc7-be7a-99562813eb13 service nova] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] No waiting events found dispatching network-vif-plugged-ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729 {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:27:38 user nova-compute[71605]: WARNING nova.compute.manager [req-8b1b3f68-edaf-4d39-8423-b1263b5709a2 req-0304a67a-91e0-4fc7-be7a-99562813eb13 service nova] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Received unexpected event network-vif-plugged-ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729 for instance with vm_state deleted and task_state None. 
Apr 20 16:27:38 user nova-compute[71605]: DEBUG nova.compute.manager [req-8b1b3f68-edaf-4d39-8423-b1263b5709a2 req-0304a67a-91e0-4fc7-be7a-99562813eb13 service nova] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Received event network-vif-deleted-ccb48d0c-5b4b-4af7-b81b-a0eb20e0a729 {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:27:41 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:27:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:27:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:27:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:27:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:27:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:27:46 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:27:51 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:27:51 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] VM Stopped (Lifecycle Event) Apr 20 16:27:51 user nova-compute[71605]: DEBUG nova.compute.manager [None req-3e182e72-fd2f-44aa-a30d-f1208ffd2fa3 None None] [instance: b3efdbb9-b302-44db-bba1-32181ad4e70d] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:27:51 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:27:56 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:27:58 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:27:58 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:27:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:27:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:27:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:27:58 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:27:58 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:27:58 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:27:58 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=9196MB free_disk=26.302379608154297GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": 
"07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:27:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:27:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:27:58 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:27:58 user 
nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:27:58 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:27:58 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:27:58 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:27:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.150s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:28:00 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:28:00 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:28:01 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:28:02 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:28:02 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:28:02 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache 
/opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:28:02 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Didn't find any instances for network info cache update. {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 20 16:28:02 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:28:02 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:28:03 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:28:06 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:07 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:28:08 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:28:11 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:28:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:28:21 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:28:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:28:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:28:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:28:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 
16:28:26 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:28:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquiring lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:28:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:28:30 user nova-compute[71605]: DEBUG nova.compute.manager [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Starting instance... {{(pid=71605) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 20 16:28:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:28:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:28:30 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71605) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 20 16:28:30 user nova-compute[71605]: INFO nova.compute.claims [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Claim successful on node user Apr 20 16:28:30 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:28:30 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:28:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.200s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:28:30 user nova-compute[71605]: DEBUG nova.compute.manager [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Start building networks asynchronously for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 20 16:28:30 user nova-compute[71605]: DEBUG nova.compute.manager [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Allocating IP information in the background. {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 20 16:28:30 user nova-compute[71605]: DEBUG nova.network.neutron [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] allocate_for_instance() {{(pid=71605) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 20 16:28:30 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 20 16:28:30 user nova-compute[71605]: DEBUG nova.compute.manager [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Start building block device mappings for instance. {{(pid=71605) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 20 16:28:30 user nova-compute[71605]: DEBUG nova.policy [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f4ec42233fc040d0bef4f2e408a561b7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1558b18ec4304868ad6d8c61b5525d55', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71605) authorize /opt/stack/nova/nova/policy.py:203}} Apr 20 16:28:30 user nova-compute[71605]: DEBUG nova.compute.manager [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Start spawning the instance on the hypervisor. {{(pid=71605) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 20 16:28:30 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Creating instance directory {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 20 16:28:30 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Creating image(s) Apr 20 16:28:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquiring lock "/opt/stack/data/nova/instances/2c4e2b86-7582-425a-9f8c-15e7bbaefb71/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:28:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "/opt/stack/data/nova/instances/2c4e2b86-7582-425a-9f8c-15e7bbaefb71/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:28:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock 
"/opt/stack/data/nova/instances/2c4e2b86-7582-425a-9f8c-15e7bbaefb71/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:28:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquiring lock "ba661e0fcad9fad5b82e541158dacc04eee806c3" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:28:30 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "ba661e0fcad9fad5b82e541158dacc04eee806c3" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:28:30 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ba661e0fcad9fad5b82e541158dacc04eee806c3.part --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ba661e0fcad9fad5b82e541158dacc04eee806c3.part --force-share --output=json" returned: 0 in 0.130s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG nova.virt.images [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] d41eddb3-3b65-456d-ab27-e42a9678ef7d was qcow2, converting to raw {{(pid=71605) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG nova.privsep.utils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71605) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/ba661e0fcad9fad5b82e541158dacc04eee806c3.part /opt/stack/data/nova/instances/_base/ba661e0fcad9fad5b82e541158dacc04eee806c3.converted {{(pid=71605) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/ba661e0fcad9fad5b82e541158dacc04eee806c3.part /opt/stack/data/nova/instances/_base/ba661e0fcad9fad5b82e541158dacc04eee806c3.converted" returned: 0 in 0.222s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ba661e0fcad9fad5b82e541158dacc04eee806c3.converted --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ba661e0fcad9fad5b82e541158dacc04eee806c3.converted --force-share --output=json" returned: 0 in 0.132s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "ba661e0fcad9fad5b82e541158dacc04eee806c3" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.775s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ba661e0fcad9fad5b82e541158dacc04eee806c3 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ba661e0fcad9fad5b82e541158dacc04eee806c3 --force-share --output=json" returned: 0 in 0.138s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 
tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquiring lock "ba661e0fcad9fad5b82e541158dacc04eee806c3" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "ba661e0fcad9fad5b82e541158dacc04eee806c3" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ba661e0fcad9fad5b82e541158dacc04eee806c3 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG nova.network.neutron [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Successfully created port: 69123f68-1692-4e10-9cee-e15040c7638f {{(pid=71605) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ba661e0fcad9fad5b82e541158dacc04eee806c3 --force-share --output=json" returned: 0 in 0.141s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/ba661e0fcad9fad5b82e541158dacc04eee806c3,backing_fmt=raw /opt/stack/data/nova/instances/2c4e2b86-7582-425a-9f8c-15e7bbaefb71/disk 1073741824 {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/ba661e0fcad9fad5b82e541158dacc04eee806c3,backing_fmt=raw /opt/stack/data/nova/instances/2c4e2b86-7582-425a-9f8c-15e7bbaefb71/disk 1073741824" returned: 0 in 0.051s {{(pid=71605) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "ba661e0fcad9fad5b82e541158dacc04eee806c3" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.197s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ba661e0fcad9fad5b82e541158dacc04eee806c3 --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5007 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/ba661e0fcad9fad5b82e541158dacc04eee806c3 --force-share --output=json" returned: 0 in 0.135s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Checking if we can resize image /opt/stack/data/nova/instances/2c4e2b86-7582-425a-9f8c-15e7bbaefb71/disk. 
size=1073741824 {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 20 16:28:31 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c4e2b86-7582-425a-9f8c-15e7bbaefb71/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c4e2b86-7582-425a-9f8c-15e7bbaefb71/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.virt.disk.api [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Cannot resize image /opt/stack/data/nova/instances/2c4e2b86-7582-425a-9f8c-15e7bbaefb71/disk to a smaller size. {{(pid=71605) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.objects.instance [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lazy-loading 'migration_context' on Instance uuid 2c4e2b86-7582-425a-9f8c-15e7bbaefb71 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Created local disks {{(pid=71605) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Ensure instance console log exists: /opt/stack/data/nova/instances/2c4e2b86-7582-425a-9f8c-15e7bbaefb71/console.log {{(pid=71605) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.network.neutron [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Successfully updated port: 69123f68-1692-4e10-9cee-e15040c7638f {{(pid=71605) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquiring lock "refresh_cache-2c4e2b86-7582-425a-9f8c-15e7bbaefb71" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquired lock "refresh_cache-2c4e2b86-7582-425a-9f8c-15e7bbaefb71" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.network.neutron [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Building network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.compute.manager [req-36d66d27-749b-4aef-894f-9bdf1bfa1803 req-85cc62c8-548c-42e8-a5eb-39a87bc779c7 service nova] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Received event network-changed-69123f68-1692-4e10-9cee-e15040c7638f {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.compute.manager [req-36d66d27-749b-4aef-894f-9bdf1bfa1803 req-85cc62c8-548c-42e8-a5eb-39a87bc779c7 service nova] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Refreshing instance network info cache due to event network-changed-69123f68-1692-4e10-9cee-e15040c7638f. 
{{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-36d66d27-749b-4aef-894f-9bdf1bfa1803 req-85cc62c8-548c-42e8-a5eb-39a87bc779c7 service nova] Acquiring lock "refresh_cache-2c4e2b86-7582-425a-9f8c-15e7bbaefb71" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.network.neutron [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Instance cache missing network info. {{(pid=71605) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.network.neutron [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Updating instance_info_cache with network_info: [{"id": "69123f68-1692-4e10-9cee-e15040c7638f", "address": "fa:16:3e:ff:f7:28", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69123f68-16", "ovs_interfaceid": "69123f68-1692-4e10-9cee-e15040c7638f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Releasing lock "refresh_cache-2c4e2b86-7582-425a-9f8c-15e7bbaefb71" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.compute.manager [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Instance network_info: |[{"id": "69123f68-1692-4e10-9cee-e15040c7638f", "address": "fa:16:3e:ff:f7:28", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69123f68-16", "ovs_interfaceid": "69123f68-1692-4e10-9cee-e15040c7638f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71605) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-36d66d27-749b-4aef-894f-9bdf1bfa1803 req-85cc62c8-548c-42e8-a5eb-39a87bc779c7 service nova] Acquired lock "refresh_cache-2c4e2b86-7582-425a-9f8c-15e7bbaefb71" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.network.neutron [req-36d66d27-749b-4aef-894f-9bdf1bfa1803 req-85cc62c8-548c-42e8-a5eb-39a87bc779c7 service nova] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Refreshing network info cache for port 69123f68-1692-4e10-9cee-e15040c7638f {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Start _get_guest_xml network_info=[{"id": "69123f68-1692-4e10-9cee-e15040c7638f", "address": "fa:16:3e:ff:f7:28", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69123f68-16", "ovs_interfaceid": "69123f68-1692-4e10-9cee-e15040c7638f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T16:28:27Z,direct_url=,disk_format='qcow2',id=d41eddb3-3b65-456d-ab27-e42a9678ef7d,min_disk=0,min_ram=0,name='tempest-scenario-img--818931373',owner='1558b18ec4304868ad6d8c61b5525d55',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T16:28:29Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_secret_uuid': None, 'encryption_options': None, 'device_type': 'disk', 'encrypted': False, 'size': 0, 'encryption_format': None, 'guest_format': None, 'device_name': '/dev/vda', 'disk_bus': 'virtio', 'image_id': 'd41eddb3-3b65-456d-ab27-e42a9678ef7d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71605) _get_guest_xml 
/opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 20 16:28:32 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:28:32 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71605) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-20T16:00:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-20T16:28:27Z,direct_url=,disk_format='qcow2',id=d41eddb3-3b65-456d-ab27-e42a9678ef7d,min_disk=0,min_ram=0,name='tempest-scenario-img--818931373',owner='1558b18ec4304868ad6d8c61b5525d55',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-20T16:28:29Z,virtual_size=,visibility=), allow threads: True {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Flavor limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Image limits 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Flavor pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Image pref 0:0:0 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG 
nova.virt.hardware [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71605) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Got 1 possible topologies {{(pid=71605) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.virt.hardware [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71605) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:28:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1486305246',display_name='tempest-TestMinimumBasicScenario-server-1486305246',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1486305246',id=25,image_ref='d41eddb3-3b65-456d-ab27-e42a9678ef7d',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPiZZWNvgI17HKNI+BXt+KB+YEhHbL07acsxespKHluya8pbshsGDbeReZUbLVLN3F7mDKtBiKIV6ZgsjnVVog16/FMqASI+9Nj8b/dyFgrEQl+sLNa7cvuPjZiurAj63Q==',key_name='tempest-TestMinimumBasicScenario-2042608265',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1558b18ec4304868ad6d8c61b5525d55',ramdisk_id='',reservation_id='r-h7yi0pza',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d41eddb3-3b65-456d-ab27-e42a9678ef7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-1763718283',owner_user_name='tempest-TestMinimumBasicScenario-1763718283-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:28:31Z,user_data=None,user_id='f4ec42233fc040d0bef4f2e408a561b7',uuid=2c4e2b86-7582-425a-9f8c-15e7bbaefb71,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69123f68-1692-4e10-9cee-e15040c7638f", "address": "fa:16:3e:ff:f7:28", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69123f68-16", "ovs_interfaceid": "69123f68-1692-4e10-9cee-e15040c7638f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71605) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Converting VIF {"id": "69123f68-1692-4e10-9cee-e15040c7638f", "address": "fa:16:3e:ff:f7:28", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69123f68-16", "ovs_interfaceid": 
"69123f68-1692-4e10-9cee-e15040c7638f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:f7:28,bridge_name='br-int',has_traffic_filtering=True,id=69123f68-1692-4e10-9cee-e15040c7638f,network=Network(b19664d3-6727-47cf-81c3-ee6ea3992cb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69123f68-16') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.objects.instance [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lazy-loading 'pci_devices' on Instance uuid 2c4e2b86-7582-425a-9f8c-15e7bbaefb71 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] End _get_guest_xml xml= Apr 20 16:28:32 user nova-compute[71605]: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71 Apr 20 16:28:32 user nova-compute[71605]: instance-00000019 Apr 20 16:28:32 user nova-compute[71605]: 131072 Apr 20 16:28:32 user nova-compute[71605]: 1 Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: tempest-TestMinimumBasicScenario-server-1486305246 Apr 20 16:28:32 user nova-compute[71605]: 2023-04-20 16:28:32 Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: 128 Apr 20 16:28:32 user nova-compute[71605]: 1 Apr 20 16:28:32 user nova-compute[71605]: 0 Apr 20 16:28:32 user nova-compute[71605]: 0 Apr 20 16:28:32 user nova-compute[71605]: 1 Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: tempest-TestMinimumBasicScenario-1763718283-project-member Apr 20 16:28:32 user nova-compute[71605]: tempest-TestMinimumBasicScenario-1763718283 Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: OpenStack Foundation Apr 20 16:28:32 user nova-compute[71605]: OpenStack Nova Apr 20 16:28:32 user nova-compute[71605]: 0.0.0 Apr 20 16:28:32 user nova-compute[71605]: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71 Apr 20 16:28:32 user nova-compute[71605]: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71 Apr 20 16:28:32 user nova-compute[71605]: Virtual Machine Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user 
nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: hvm Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Nehalem Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: /dev/urandom Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: Apr 20 16:28:32 user nova-compute[71605]: {{(pid=71605) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:28:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1486305246',display_name='tempest-TestMinimumBasicScenario-server-1486305246',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1486305246',id=25,image_ref='d41eddb3-3b65-456d-ab27-e42a9678ef7d',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPiZZWNvgI17HKNI+BXt+KB+YEhHbL07acsxespKHluya8pbshsGDbeReZUbLVLN3F7mDKtBiKIV6ZgsjnVVog16/FMqASI+9Nj8b/dyFgrEQl+sLNa7cvuPjZiurAj63Q==',key_name='tempest-TestMinimumBasicScenario-2042608265',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1558b18ec4304868ad6d8c61b5525d55',ramdisk_id='',reservation_id='r-h7yi0pza',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d41eddb3-3b65-456d-ab27-e42a9678ef7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-1763718283',owner_user_name='tempest-TestMinimumBasicScenario-1763718283-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-20T16:28:31Z,user_data=None,user_id='f4ec42233fc040d0bef4f2e408a561b7',uuid=2c4e2b86-7582-425a-9f8c-15e7bbaefb71,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69123f68-1692-4e10-9cee-e15040c7638f", "address": "fa:16:3e:ff:f7:28", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69123f68-16", "ovs_interfaceid": "69123f68-1692-4e10-9cee-e15040c7638f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Converting VIF {"id": "69123f68-1692-4e10-9cee-e15040c7638f", "address": "fa:16:3e:ff:f7:28", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69123f68-16", "ovs_interfaceid": 
"69123f68-1692-4e10-9cee-e15040c7638f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:f7:28,bridge_name='br-int',has_traffic_filtering=True,id=69123f68-1692-4e10-9cee-e15040c7638f,network=Network(b19664d3-6727-47cf-81c3-ee6ea3992cb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69123f68-16') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG os_vif [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:f7:28,bridge_name='br-int',has_traffic_filtering=True,id=69123f68-1692-4e10-9cee-e15040c7638f,network=Network(b19664d3-6727-47cf-81c3-ee6ea3992cb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69123f68-16') {{(pid=71605) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69123f68-16, may_exist=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap69123f68-16, col_values=(('external_ids', {'iface-id': '69123f68-1692-4e10-9cee-e15040c7638f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:f7:28', 'vm-uuid': '2c4e2b86-7582-425a-9f8c-15e7bbaefb71'}),)) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:32 user nova-compute[71605]: INFO os_vif [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:f7:28,bridge_name='br-int',has_traffic_filtering=True,id=69123f68-1692-4e10-9cee-e15040c7638f,network=Network(b19664d3-6727-47cf-81c3-ee6ea3992cb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69123f68-16') Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] No BDM found with device name vda, not building metadata. {{(pid=71605) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 20 16:28:32 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] No VIF found with MAC fa:16:3e:ff:f7:28, not building metadata {{(pid=71605) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 20 16:28:33 user nova-compute[71605]: DEBUG nova.network.neutron [req-36d66d27-749b-4aef-894f-9bdf1bfa1803 req-85cc62c8-548c-42e8-a5eb-39a87bc779c7 service nova] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Updated VIF entry in instance network info cache for port 69123f68-1692-4e10-9cee-e15040c7638f. {{(pid=71605) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 20 16:28:33 user nova-compute[71605]: DEBUG nova.network.neutron [req-36d66d27-749b-4aef-894f-9bdf1bfa1803 req-85cc62c8-548c-42e8-a5eb-39a87bc779c7 service nova] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Updating instance_info_cache with network_info: [{"id": "69123f68-1692-4e10-9cee-e15040c7638f", "address": "fa:16:3e:ff:f7:28", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69123f68-16", "ovs_interfaceid": "69123f68-1692-4e10-9cee-e15040c7638f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:28:33 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-36d66d27-749b-4aef-894f-9bdf1bfa1803 req-85cc62c8-548c-42e8-a5eb-39a87bc779c7 service nova] Releasing lock "refresh_cache-2c4e2b86-7582-425a-9f8c-15e7bbaefb71" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:28:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:34 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:35 user nova-compute[71605]: DEBUG nova.compute.manager [req-ea2dade3-cb74-40db-945a-1f2b2578fe6a req-e5761fc0-19aa-440d-91b0-99243c6278f1 service nova] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Received event network-vif-plugged-69123f68-1692-4e10-9cee-e15040c7638f {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:28:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ea2dade3-cb74-40db-945a-1f2b2578fe6a req-e5761fc0-19aa-440d-91b0-99243c6278f1 service nova] Acquiring lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:28:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ea2dade3-cb74-40db-945a-1f2b2578fe6a req-e5761fc0-19aa-440d-91b0-99243c6278f1 service nova] Lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:28:35 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-ea2dade3-cb74-40db-945a-1f2b2578fe6a req-e5761fc0-19aa-440d-91b0-99243c6278f1 service nova] Lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:28:35 user nova-compute[71605]: DEBUG nova.compute.manager [req-ea2dade3-cb74-40db-945a-1f2b2578fe6a req-e5761fc0-19aa-440d-91b0-99243c6278f1 service nova] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] No waiting events found dispatching network-vif-plugged-69123f68-1692-4e10-9cee-e15040c7638f {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:28:35 user nova-compute[71605]: WARNING nova.compute.manager [req-ea2dade3-cb74-40db-945a-1f2b2578fe6a req-e5761fc0-19aa-440d-91b0-99243c6278f1 service nova] 
[instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Received unexpected event network-vif-plugged-69123f68-1692-4e10-9cee-e15040c7638f for instance with vm_state building and task_state spawning. Apr 20 16:28:36 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Resumed> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:28:36 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] VM Resumed (Lifecycle Event) Apr 20 16:28:36 user nova-compute[71605]: DEBUG nova.compute.manager [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Instance event wait completed in 0 seconds for {{(pid=71605) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 20 16:28:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Guest created on hypervisor {{(pid=71605) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 20 16:28:36 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Instance spawned successfully. Apr 20 16:28:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 20 16:28:36 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:28:36 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:28:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Found default for hw_cdrom_bus of ide {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:28:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Found default for hw_disk_bus of virtio {{(pid=71605) 
_register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:28:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Found default for hw_input_bus of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:28:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Found default for hw_pointer_model of None {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:28:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Found default for hw_video_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:28:36 user nova-compute[71605]: DEBUG nova.virt.libvirt.driver [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Found default for hw_vif_model of virtio {{(pid=71605) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 20 16:28:36 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] During sync_power_state the instance has a pending task (spawning). Skip. Apr 20 16:28:36 user nova-compute[71605]: DEBUG nova.virt.driver [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] Emitting event Started> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:28:36 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] VM Started (Lifecycle Event) Apr 20 16:28:36 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:28:36 user nova-compute[71605]: DEBUG nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71605) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 20 16:28:36 user nova-compute[71605]: INFO nova.compute.manager [None req-ec05cd21-1c35-454a-9754-a22885e21533 None None] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] During sync_power_state the instance has a pending task (spawning). Skip. 
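
The two lifecycle events above ("Resumed", then "Started") each trigger a power-state sync while the build is still in flight; because the instance still has a pending task (spawning), the handler logs "Skip." instead of overwriting the database power state. A minimal sketch of that guard, using simplified, hypothetical names rather than nova's actual implementation:

    # Simplified illustration of the "pending task -> skip" behaviour seen above.
    # Power-state codes follow the values in the log: 0 = NOSTATE, 1 = RUNNING.
    NOSTATE, RUNNING = 0, 1

    def sync_power_state(db_power_state, vm_power_state, task_state):
        """Return the power state to record in the DB, or None to skip the sync."""
        if task_state is not None:
            # Another operation (here: 'spawning') owns the instance; leave the
            # DB untouched rather than racing it, exactly as the log shows.
            return None
        if db_power_state != vm_power_state:
            # No task in flight and the views disagree: trust the hypervisor.
            return vm_power_state
        return db_power_state

    # The situation logged above: DB power_state 0, VM power_state 1, task 'spawning'.
    assert sync_power_state(NOSTATE, RUNNING, task_state='spawning') is None

Once the spawn finishes and task_state returns to None, a later periodic power-state sync can reconcile the database view with the hypervisor's view without interfering with the build.
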
Apr 20 16:28:36 user nova-compute[71605]: INFO nova.compute.manager [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Took 5.80 seconds to spawn the instance on the hypervisor. Apr 20 16:28:36 user nova-compute[71605]: DEBUG nova.compute.manager [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:28:36 user nova-compute[71605]: INFO nova.compute.manager [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Took 6.29 seconds to build instance. Apr 20 16:28:36 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-a9604a8b-fab9-4af1-9d67-5182007b2b52 tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.382s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:28:37 user nova-compute[71605]: DEBUG nova.compute.manager [req-b97759b7-2b8a-44a4-a30f-a1653e1c1911 req-a0d2aefd-4ce9-46b6-8d2f-0fa49e024e0f service nova] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Received event network-vif-plugged-69123f68-1692-4e10-9cee-e15040c7638f {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:28:37 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b97759b7-2b8a-44a4-a30f-a1653e1c1911 req-a0d2aefd-4ce9-46b6-8d2f-0fa49e024e0f service nova] Acquiring lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:28:37 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b97759b7-2b8a-44a4-a30f-a1653e1c1911 req-a0d2aefd-4ce9-46b6-8d2f-0fa49e024e0f service nova] Lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:28:37 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-b97759b7-2b8a-44a4-a30f-a1653e1c1911 req-a0d2aefd-4ce9-46b6-8d2f-0fa49e024e0f service nova] Lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:28:37 user nova-compute[71605]: DEBUG nova.compute.manager [req-b97759b7-2b8a-44a4-a30f-a1653e1c1911 req-a0d2aefd-4ce9-46b6-8d2f-0fa49e024e0f service nova] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] No waiting events found dispatching network-vif-plugged-69123f68-1692-4e10-9cee-e15040c7638f {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:28:37 user nova-compute[71605]: WARNING nova.compute.manager 
[req-b97759b7-2b8a-44a4-a30f-a1653e1c1911 req-a0d2aefd-4ce9-46b6-8d2f-0fa49e024e0f service nova] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Received unexpected event network-vif-plugged-69123f68-1692-4e10-9cee-e15040c7638f for instance with vm_state active and task_state None. Apr 20 16:28:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:39 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:28:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:28:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:28:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:28:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:49 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:54 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:57 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:28:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:28:58 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:28:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock 
"compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:28:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:28:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:28:58 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:28:58 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c4e2b86-7582-425a-9f8c-15e7bbaefb71/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:28:58 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c4e2b86-7582-425a-9f8c-15e7bbaefb71/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:28:58 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c4e2b86-7582-425a-9f8c-15e7bbaefb71/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:28:58 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c4e2b86-7582-425a-9f8c-15e7bbaefb71/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:28:59 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:28:59 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 20 16:28:59 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=9022MB free_disk=26.265037536621094GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": 
"0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:28:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:28:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:28:59 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 2c4e2b86-7582-425a-9f8c-15e7bbaefb71 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:28:59 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:28:59 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:28:59 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:28:59 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:28:59 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:28:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.191s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:28:59 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:29:00 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:29:02 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:29:02 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:29:02 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None 
req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:29:02 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:29:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:29:03 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:29:04 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:29:04 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:29:04 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:29:04 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:29:04 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-2c4e2b86-7582-425a-9f8c-15e7bbaefb71" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:29:04 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-2c4e2b86-7582-425a-9f8c-15e7bbaefb71" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:29:04 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:29:04 user nova-compute[71605]: DEBUG nova.objects.instance [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lazy-loading 'info_cache' on Instance uuid 2c4e2b86-7582-425a-9f8c-15e7bbaefb71 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:29:04 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:29:04 user nova-compute[71605]: DEBUG nova.network.neutron [None 
req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Updating instance_info_cache with network_info: [{"id": "69123f68-1692-4e10-9cee-e15040c7638f", "address": "fa:16:3e:ff:f7:28", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69123f68-16", "ovs_interfaceid": "69123f68-1692-4e10-9cee-e15040c7638f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:29:04 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock "refresh_cache-2c4e2b86-7582-425a-9f8c-15e7bbaefb71" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:29:04 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:29:04 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:29:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:29:08 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:29:10 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:29:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:29:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:29:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 
20 16:29:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:29:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:29:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:29:16 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:29:16 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Cleaning up deleted instances {{(pid=71605) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 20 16:29:16 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] There are 0 instances to clean {{(pid=71605) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 20 16:29:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:29:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:29:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:29:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:29:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:29:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:29:19 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:29:21 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._sync_power_states {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:29:21 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Triggering sync for uuid 2c4e2b86-7582-425a-9f8c-15e7bbaefb71 {{(pid=71605) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 20 16:29:21 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71" by 
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:29:21 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:29:21 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.025s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:29:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:29:23 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:29:23 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Cleaning up deleted instances with incomplete migration {{(pid=71605) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 20 16:29:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:29:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:29:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:29:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:29:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:29:27 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:29:32 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:29:37 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:29:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:29:42 user 
nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:29:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:29:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:29:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:29:42 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:29:44 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:29:47 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:29:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:29:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:29:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:29:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:29:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:29:52 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:29:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:29:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:29:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:29:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:29:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition 
/usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:29:57 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:29:58 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:29:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:29:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:29:58 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:29:58 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:29:58 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c4e2b86-7582-425a-9f8c-15e7bbaefb71/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:29:58 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c4e2b86-7582-425a-9f8c-15e7bbaefb71/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:29:58 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c4e2b86-7582-425a-9f8c-15e7bbaefb71/disk --force-share --output=json {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 20 16:29:58 user nova-compute[71605]: DEBUG oslo_concurrency.processutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env 
LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2c4e2b86-7582-425a-9f8c-15e7bbaefb71/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71605) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 20 16:29:59 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:29:59 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:29:59 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=9080MB free_disk=26.264328002929688GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": 
"0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:29:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:29:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:29:59 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Instance 2c4e2b86-7582-425a-9f8c-15e7bbaefb71 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71605) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 20 16:29:59 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:29:59 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:29:59 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Refreshing inventories for resource provider 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 20 16:29:59 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Updating ProviderTree inventory for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 20 16:29:59 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Updating inventory in ProviderTree for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 20 16:29:59 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Refreshing aggregate associations for resource provider 00e9f769-1a1c-4f1e-80e4-b19657803102, aggregates: None {{(pid=71605) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 20 16:29:59 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Refreshing trait associations for resource provider 00e9f769-1a1c-4f1e-80e4-b19657803102, traits: 
COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE2,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_MMX,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI {{(pid=71605) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 20 16:29:59 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:29:59 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:29:59 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:29:59 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.365s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:30:01 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:30:02 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:30:02 user nova-compute[71605]: 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:30:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:30:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:30:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:30:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:30:02 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:30:04 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:30:04 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:30:04 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:30:04 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:30:06 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:30:06 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:30:06 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:30:06 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "refresh_cache-2c4e2b86-7582-425a-9f8c-15e7bbaefb71" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 20 16:30:06 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquired lock "refresh_cache-2c4e2b86-7582-425a-9f8c-15e7bbaefb71" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 20 16:30:06 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Forcefully refreshing network info cache for instance {{(pid=71605) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 20 16:30:06 user nova-compute[71605]: DEBUG nova.objects.instance [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lazy-loading 'info_cache' on Instance uuid 2c4e2b86-7582-425a-9f8c-15e7bbaefb71 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:30:06 user nova-compute[71605]: DEBUG nova.network.neutron [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Updating instance_info_cache with network_info: [{"id": "69123f68-1692-4e10-9cee-e15040c7638f", "address": "fa:16:3e:ff:f7:28", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69123f68-16", "ovs_interfaceid": "69123f68-1692-4e10-9cee-e15040c7638f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:30:06 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Releasing lock 
"refresh_cache-2c4e2b86-7582-425a-9f8c-15e7bbaefb71" {{(pid=71605) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 20 16:30:06 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Updated the network info_cache for instance {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 20 16:30:07 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:30:10 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:30:11 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:30:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:30:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:30:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:30:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:30:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:30:12 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:30:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:30:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:30:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:30:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:30:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:30:17 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:30:22 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquiring lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:30:22 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:30:22 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquiring lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:30:22 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:30:22 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:30:22 user nova-compute[71605]: INFO nova.compute.manager [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Terminating instance Apr 20 16:30:22 user nova-compute[71605]: DEBUG nova.compute.manager [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Start destroying the instance on the hypervisor. 
{{(pid=71605) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 20 16:30:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:30:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:30:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:30:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:30:22 user nova-compute[71605]: DEBUG nova.compute.manager [req-df14bdc4-4d05-4e05-8a24-ab9f3524673f req-d8485864-2cb9-416f-9ffe-e500b88376ff service nova] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Received event network-vif-unplugged-69123f68-1692-4e10-9cee-e15040c7638f {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:30:22 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-df14bdc4-4d05-4e05-8a24-ab9f3524673f req-d8485864-2cb9-416f-9ffe-e500b88376ff service nova] Acquiring lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:30:22 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-df14bdc4-4d05-4e05-8a24-ab9f3524673f req-d8485864-2cb9-416f-9ffe-e500b88376ff service nova] Lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:30:22 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-df14bdc4-4d05-4e05-8a24-ab9f3524673f req-d8485864-2cb9-416f-9ffe-e500b88376ff service nova] Lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:30:22 user nova-compute[71605]: DEBUG nova.compute.manager [req-df14bdc4-4d05-4e05-8a24-ab9f3524673f req-d8485864-2cb9-416f-9ffe-e500b88376ff service nova] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] No waiting events found dispatching network-vif-unplugged-69123f68-1692-4e10-9cee-e15040c7638f {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:30:22 user nova-compute[71605]: DEBUG nova.compute.manager [req-df14bdc4-4d05-4e05-8a24-ab9f3524673f req-d8485864-2cb9-416f-9ffe-e500b88376ff service nova] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Received event network-vif-unplugged-69123f68-1692-4e10-9cee-e15040c7638f for instance with task_state deleting. 
{{(pid=71605) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 20 16:30:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:30:22 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:30:23 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:30:23 user nova-compute[71605]: INFO nova.virt.libvirt.driver [-] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Instance destroyed successfully. Apr 20 16:30:23 user nova-compute[71605]: DEBUG nova.objects.instance [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lazy-loading 'resources' on Instance uuid 2c4e2b86-7582-425a-9f8c-15e7bbaefb71 {{(pid=71605) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 20 16:30:23 user nova-compute[71605]: DEBUG nova.virt.libvirt.vif [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-20T16:28:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1486305246',display_name='tempest-TestMinimumBasicScenario-server-1486305246',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1486305246',id=25,image_ref='d41eddb3-3b65-456d-ab27-e42a9678ef7d',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPiZZWNvgI17HKNI+BXt+KB+YEhHbL07acsxespKHluya8pbshsGDbeReZUbLVLN3F7mDKtBiKIV6ZgsjnVVog16/FMqASI+9Nj8b/dyFgrEQl+sLNa7cvuPjZiurAj63Q==',key_name='tempest-TestMinimumBasicScenario-2042608265',keypairs=,launch_index=0,launched_at=2023-04-20T16:28:36Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='1558b18ec4304868ad6d8c61b5525d55',ramdisk_id='',reservation_id='r-h7yi0pza',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='d41eddb3-3b65-456d-ab27-e42a9678ef7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-1763718283',owner_user_name='tempest-TestMinimumBasicScenario-1763718283-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-20T16:28:36Z,user_data=None,user_id='f4ec42233fc040d0bef4f2e408a561b7',uuid=2c4e2b86-7582-425a-9f8c-15e7bbaefb71,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "69123f68-1692-4e10-9cee-e15040c7638f", "address": "fa:16:3e:ff:f7:28", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap69123f68-16", "ovs_interfaceid": "69123f68-1692-4e10-9cee-e15040c7638f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 20 16:30:23 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Converting VIF {"id": "69123f68-1692-4e10-9cee-e15040c7638f", "address": "fa:16:3e:ff:f7:28", "network": {"id": "b19664d3-6727-47cf-81c3-ee6ea3992cb8", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-498816505-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "1558b18ec4304868ad6d8c61b5525d55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": 
"ovn"}}, "devname": "tap69123f68-16", "ovs_interfaceid": "69123f68-1692-4e10-9cee-e15040c7638f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 20 16:30:23 user nova-compute[71605]: DEBUG nova.network.os_vif_util [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ff:f7:28,bridge_name='br-int',has_traffic_filtering=True,id=69123f68-1692-4e10-9cee-e15040c7638f,network=Network(b19664d3-6727-47cf-81c3-ee6ea3992cb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69123f68-16') {{(pid=71605) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 20 16:30:23 user nova-compute[71605]: DEBUG os_vif [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:f7:28,bridge_name='br-int',has_traffic_filtering=True,id=69123f68-1692-4e10-9cee-e15040c7638f,network=Network(b19664d3-6727-47cf-81c3-ee6ea3992cb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69123f68-16') {{(pid=71605) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 20 16:30:23 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 19 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:30:23 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69123f68-16, bridge=br-int, if_exists=True) {{(pid=71605) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 20 16:30:23 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:30:23 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:30:23 user nova-compute[71605]: INFO os_vif [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:f7:28,bridge_name='br-int',has_traffic_filtering=True,id=69123f68-1692-4e10-9cee-e15040c7638f,network=Network(b19664d3-6727-47cf-81c3-ee6ea3992cb8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69123f68-16') Apr 20 16:30:23 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Deleting instance files /opt/stack/data/nova/instances/2c4e2b86-7582-425a-9f8c-15e7bbaefb71_del Apr 20 16:30:23 user nova-compute[71605]: INFO nova.virt.libvirt.driver [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 
tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Deletion of /opt/stack/data/nova/instances/2c4e2b86-7582-425a-9f8c-15e7bbaefb71_del complete Apr 20 16:30:23 user nova-compute[71605]: INFO nova.compute.manager [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Took 0.84 seconds to destroy the instance on the hypervisor. Apr 20 16:30:23 user nova-compute[71605]: DEBUG oslo.service.loopingcall [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71605) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 20 16:30:23 user nova-compute[71605]: DEBUG nova.compute.manager [-] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Deallocating network for instance {{(pid=71605) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 20 16:30:23 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] deallocate_for_instance() {{(pid=71605) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 20 16:30:23 user nova-compute[71605]: DEBUG nova.network.neutron [-] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Updating instance_info_cache with network_info: [] {{(pid=71605) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 20 16:30:23 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Took 0.64 seconds to deallocate network for instance. 
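The loopingcall wait logged above ("Waiting for function ... _deallocate_network_with_retries to return") refers to a Nova-internal retry helper. As an illustration only, under the assumption that the helper simply retries the deallocation call with backoff (this is not Nova's actual implementation), the general shape in plain Python is:

    import time

    def call_with_retries(func, attempts=3, initial_delay=1.0, backoff=2.0):
        # Retry func() until it succeeds or the attempt budget is exhausted,
        # sleeping an exponentially growing delay between attempts.
        delay = initial_delay
        for attempt in range(1, attempts + 1):
            try:
                return func()
            except Exception:
                if attempt == attempts:
                    raise
                time.sleep(delay)
                delay *= backoff

    # Hypothetical usage; deallocate_network() stands in for the real call:
    # call_with_retries(lambda: deallocate_network(instance), attempts=3)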
Apr 20 16:30:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:30:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:30:24 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:30:24 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:30:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.116s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:30:24 user nova-compute[71605]: INFO nova.scheduler.client.report [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Deleted allocations for instance 2c4e2b86-7582-425a-9f8c-15e7bbaefb71 Apr 20 16:30:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-14297b38-c7a8-48e6-940e-ca121ff91c5c tempest-TestMinimumBasicScenario-1763718283 tempest-TestMinimumBasicScenario-1763718283-project-member] Lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.887s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:30:24 user nova-compute[71605]: DEBUG nova.compute.manager [req-eed132f4-8e8b-41de-aca7-a1b3f0569f7e req-862bfdbe-ad94-4be5-a95e-8fb82d370d69 service nova] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Received event network-vif-plugged-69123f68-1692-4e10-9cee-e15040c7638f {{(pid=71605) 
external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:30:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-eed132f4-8e8b-41de-aca7-a1b3f0569f7e req-862bfdbe-ad94-4be5-a95e-8fb82d370d69 service nova] Acquiring lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:30:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-eed132f4-8e8b-41de-aca7-a1b3f0569f7e req-862bfdbe-ad94-4be5-a95e-8fb82d370d69 service nova] Lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:30:24 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [req-eed132f4-8e8b-41de-aca7-a1b3f0569f7e req-862bfdbe-ad94-4be5-a95e-8fb82d370d69 service nova] Lock "2c4e2b86-7582-425a-9f8c-15e7bbaefb71-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:30:24 user nova-compute[71605]: DEBUG nova.compute.manager [req-eed132f4-8e8b-41de-aca7-a1b3f0569f7e req-862bfdbe-ad94-4be5-a95e-8fb82d370d69 service nova] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] No waiting events found dispatching network-vif-plugged-69123f68-1692-4e10-9cee-e15040c7638f {{(pid=71605) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 20 16:30:24 user nova-compute[71605]: WARNING nova.compute.manager [req-eed132f4-8e8b-41de-aca7-a1b3f0569f7e req-862bfdbe-ad94-4be5-a95e-8fb82d370d69 service nova] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Received unexpected event network-vif-plugged-69123f68-1692-4e10-9cee-e15040c7638f for instance with vm_state deleted and task_state None. 
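The "Acquiring lock ... -events" / "acquired" / "released" triplets above come from oslo.concurrency's lockutils, which Nova uses to serialise access to an instance's pending-event table. A minimal sketch of the same primitive (assuming the oslo.concurrency package is installed; pop_event below is a hypothetical stand-in, not Nova's _pop_event):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('2c4e2b86-7582-425a-9f8c-15e7bbaefb71-events')
    def pop_event(events, name):
        # Body runs with the per-instance "-events" lock held, mirroring the
        # acquired/released pair in the log.
        return events.pop(name, None)

    waiting = {}
    # Popping an event nobody registered returns None, which corresponds to the
    # "No waiting events found dispatching ..." message above.
    pop_event(waiting, 'network-vif-plugged-69123f68-1692-4e10-9cee-e15040c7638f')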
Apr 20 16:30:24 user nova-compute[71605]: DEBUG nova.compute.manager [req-eed132f4-8e8b-41de-aca7-a1b3f0569f7e req-862bfdbe-ad94-4be5-a95e-8fb82d370d69 service nova] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Received event network-vif-deleted-69123f68-1692-4e10-9cee-e15040c7638f {{(pid=71605) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 20 16:30:28 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:30:33 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:30:38 user nova-compute[71605]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71605) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 20 16:30:38 user nova-compute[71605]: INFO nova.compute.manager [-] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] VM Stopped (Lifecycle Event) Apr 20 16:30:38 user nova-compute[71605]: DEBUG nova.compute.manager [None req-fb5528ed-0eea-46b2-8eec-9755f9474c36 None None] [instance: 2c4e2b86-7582-425a-9f8c-15e7bbaefb71] Checking state {{(pid=71605) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 20 16:30:38 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:30:38 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:30:38 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:30:38 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:30:38 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:30:38 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:30:43 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:30:48 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:30:53 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:30:58 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:31:00 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71605) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:31:00 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:31:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:31:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:31:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:31:00 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Auditing locally available compute resources for user (node: user) {{(pid=71605) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 20 16:31:00 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 20 16:31:00 user nova-compute[71605]: WARNING nova.virt.libvirt.driver [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
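The "Running periodic task ComputeManager.*" lines are driven by oslo.service's periodic_task machinery rather than by anything instance-specific. A minimal, self-contained sketch of that mechanism (assuming oslo.service and oslo.config are installed; the Tasks class and its single task are placeholders, not Nova's ComputeManager):

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF

    class Tasks(periodic_task.PeriodicTasks):
        @periodic_task.periodic_task(spacing=60, run_immediately=True)
        def poll_something(self, context):
            # Placeholder body; Nova's real tasks audit resources, heal the
            # instance info cache, poll unconfirmed resizes, and so on.
            print("periodic task ran")

    tasks = Tasks(CONF)
    tasks.run_periodic_tasks(context=None)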
Apr 20 16:31:00 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Hypervisor/Node resource view: name=user free_ram=9202MB free_disk=26.2755126953125GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71605) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 20 16:31:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 20 16:31:00 user nova-compute[71605]: DEBUG oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 20 16:31:00 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 20 16:31:00 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71605) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 20 16:31:00 user nova-compute[71605]: DEBUG nova.compute.provider_tree [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed in ProviderTree for provider: 00e9f769-1a1c-4f1e-80e4-b19657803102 {{(pid=71605) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 20 16:31:00 user nova-compute[71605]: DEBUG nova.scheduler.client.report [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Inventory has not changed for provider 00e9f769-1a1c-4f1e-80e4-b19657803102 based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71605) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 20 16:31:00 user nova-compute[71605]: DEBUG nova.compute.resource_tracker [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Compute_service record updated for user:user {{(pid=71605) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 20 16:31:00 user nova-compute[71605]: DEBUG 
oslo_concurrency.lockutils [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.143s {{(pid=71605) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 20 16:31:03 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:31:04 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:31:04 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:31:04 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:31:06 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:31:06 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Starting heal instance info cache {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 20 16:31:06 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Rebuilding the list of instances to heal {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 20 16:31:06 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Didn't find any instances for network info cache update. {{(pid=71605) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 20 16:31:06 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:31:06 user nova-compute[71605]: DEBUG nova.compute.manager [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71605) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 20 16:31:08 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:31:08 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:31:10 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:31:13 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:31:13 user nova-compute[71605]: DEBUG oslo_service.periodic_task [None req-f68d6264-78fa-40c4-81b6-77cd437f7164 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71605) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 20 16:31:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:31:16 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:31:18 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:31:19 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:31:23 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:31:28 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:31:33 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 20 16:31:33 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:31:33 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71605) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 20 16:31:33 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:31:33 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71605) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 20 16:31:33 user nova-compute[71605]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 20 16:31:38 user nova-compute[71605]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 {{(pid=71605) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
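The inventory dictionaries reported above ("Inventory has not changed for provider ...") translate into schedulable capacity in Placement as (total - reserved) * allocation_ratio per resource class. A small plain-Python check using the values from this log:

    # Values copied from the inventory reported for provider
    # 00e9f769-1a1c-4f1e-80e4-b19657803102 in the log above.
    inventory = {
        'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)   # VCPU 48.0, MEMORY_MB 15511.0, DISK_GB 40.0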