Apr 17 22:04:29 user nova-compute[71972]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
Apr 17 22:04:32 user nova-compute[71972]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=71972) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 17 22:04:32 user nova-compute[71972]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=71972) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 17 22:04:32 user nova-compute[71972]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=71972) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
Apr 17 22:04:32 user nova-compute[71972]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.020s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
Apr 17 22:04:32 user nova-compute[71972]: INFO nova.virt.driver [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] Loading compute driver 'libvirt.LibvirtDriver'
Apr 17 22:04:32 user nova-compute[71972]: INFO nova.compute.provider_config [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] Acquiring lock "singleton_lock" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] Acquired lock "singleton_lock" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] Releasing lock "singleton_lock" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] Full set of CONF: {{(pid=71972) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ******************************************************************************** {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] Configuration options gathered from: {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] command line args: ['--config-file', '/etc/nova/nova-cpu.conf'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] config files: ['/etc/nova/nova-cpu.conf'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ================================================================================ {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] allow_resize_to_same_host = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] arq_binding_timeout = 300 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] backdoor_port = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] backdoor_socket = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 
None None] block_device_allocate_retries = 300 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] block_device_allocate_retries_interval = 5 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cert = self.pem {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] compute_driver = libvirt.LibvirtDriver {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] compute_monitors = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] config_dir = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] config_drive_format = iso9660 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] config_file = ['/etc/nova/nova-cpu.conf'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] config_source = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] console_host = user {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] control_exchange = nova {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cpu_allocation_ratio = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] daemon = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] debug = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] default_access_ip_network_name = None {{(pid=71972) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] default_availability_zone = nova {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] default_ephemeral_format = ext4 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] default_schedule_zone = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] disk_allocation_ratio = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] enable_new_services = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] enabled_apis = ['osapi_compute'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] enabled_ssl_apis = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] flat_injected = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] force_config_drive = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] force_raw_images = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] graceful_shutdown_timeout = 5 {{(pid=71972) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] heal_instance_info_cache_interval = 60 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] host = user {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] initial_disk_allocation_ratio = 1.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] initial_ram_allocation_ratio = 1.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] instance_build_timeout = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] instance_delete_interval = 300 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] instance_format = [instance: %(uuid)s] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] instance_name_template = instance-%08x {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] instance_usage_audit = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] instance_usage_audit_period = month {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] 
instances_path = /opt/stack/data/nova/instances {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] internal_service_availability_zone = internal {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] key = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] live_migration_retry_count = 30 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] log_config_append = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] log_dir = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] log_file = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] log_options = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] log_rotate_interval = 1 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] log_rotate_interval_type = days {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] log_rotation_type = none {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=71972) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] long_rpc_timeout = 1800 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] max_concurrent_builds = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] max_concurrent_live_migrations = 1 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] max_concurrent_snapshots = 5 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] max_local_block_devices = 3 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] max_logfile_count = 30 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] max_logfile_size_mb = 200 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] maximum_instance_delete_attempts = 5 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] metadata_listen = 0.0.0.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] metadata_listen_port = 8775 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user 
nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] metadata_workers = 3 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] migrate_max_retries = -1 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] mkisofs_cmd = genisoimage {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] my_block_storage_ip = 10.0.0.210 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] my_ip = 10.0.0.210 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] network_allocate_retries = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] osapi_compute_listen = 0.0.0.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] osapi_compute_listen_port = 8774 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] osapi_compute_unique_server_name_scope = {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] osapi_compute_workers = 3 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] password_length = 12 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] periodic_enable = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] periodic_fuzzy_delay = 60 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: 
DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] pointer_model = ps2mouse {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] preallocate_images = none {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] publish_errors = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] pybasedir = /opt/stack/nova {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ram_allocation_ratio = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] rate_limit_burst = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] rate_limit_except_level = CRITICAL {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] rate_limit_interval = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] reboot_timeout = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] reclaim_instance_interval = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] record = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] reimage_timeout_per_gb = 20 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] report_interval = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] rescue_timeout = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] reserved_host_cpus = 
0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] reserved_host_disk_mb = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] reserved_host_memory_mb = 512 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] reserved_huge_pages = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] resize_confirm_window = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] resize_fs_using_block_device = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] resume_guests_state_on_host_boot = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] rpc_response_timeout = 60 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] run_external_periodic_tasks = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] running_deleted_instance_action = reap {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] running_deleted_instance_poll_interval = 1800 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] running_deleted_instance_timeout = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] scheduler_instance_sync_interval = 120 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None 
None] service_down_time = 60 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] servicegroup_driver = db {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] shelved_offload_time = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] shelved_poll_interval = 3600 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] shutdown_timeout = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] source_is_ipv6 = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ssl_only = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] state_path = /opt/stack/data/nova {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] sync_power_state_interval = 600 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] sync_power_state_pool_size = 1000 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] syslog_log_facility = LOG_USER {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] tempdir = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] timeout_nbd = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] transport_url = **** {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] update_resources_interval = 0 {{(pid=71972) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] use_cow_images = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] use_eventlog = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] use_journal = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] use_json = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] use_rootwrap_daemon = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] use_stderr = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] use_syslog = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vcpu_pin_set = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vif_plugging_is_fatal = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vif_plugging_timeout = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] virt_mkfs = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] volume_usage_poll_interval = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] watch_log_file = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] web = /usr/share/spice-html5 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG 
oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_concurrency.disable_process_locking = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_concurrency.lock_path = /opt/stack/data/nova {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.auth_strategy = keystone {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.compute_link_prefix = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.dhcp_domain = novalocal {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.enable_instance_password = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.glance_link_prefix = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service 
[None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.instance_list_per_project_cells = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.list_records_by_skipping_down_cells = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.local_metadata_per_cell = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.max_limit = 1000 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.metadata_cache_expiration = 15 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.neutron_default_tenant_id = default {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.use_forwarded_for = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.use_neutron_default_nets = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=71972) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.vendordata_dynamic_targets = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.vendordata_jsonfile_path = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.backend = dogpile.cache.memcached {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.backend_argument = **** {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.config_prefix = cache.oslo {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.dead_timeout = 60.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.debug_cache_backend = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.enable_retry_client = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.enable_socket_keepalive = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.enabled = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.expiration_time = 600 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.hashclient_retry_attempts = 2 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.hashclient_retry_delay = 1.0 
{{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.memcache_dead_retry = 300 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.memcache_password = {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.memcache_pool_maxsize = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.memcache_sasl_enabled = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.memcache_socket_timeout = 1.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.memcache_username = {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.proxies = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.retry_attempts = 2 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.retry_delay = 0.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] 
cache.socket_keepalive_count = 1 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.socket_keepalive_idle = 1 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.socket_keepalive_interval = 1 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.tls_allowed_ciphers = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.tls_cafile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.tls_certfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.tls_enabled = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cache.tls_keyfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cinder.auth_section = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cinder.auth_type = password {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cinder.cafile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cinder.catalog_info = volumev3::publicURL {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cinder.certfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cinder.collect_timing = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cinder.cross_az_attach = True 
{{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cinder.debug = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cinder.endpoint_template = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cinder.http_retries = 3 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cinder.insecure = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cinder.keyfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cinder.os_region_name = RegionOne {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cinder.split_loggers = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cinder.timeout = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] compute.cpu_dedicated_set = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] compute.cpu_shared_set = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] compute.image_type_exclude_list = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] compute.max_concurrent_disk_ops = 
0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] compute.max_disk_devices_to_attach = -1 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] compute.resource_provider_association_refresh = 300 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] compute.shutdown_retry_interval = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] conductor.workers = 3 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] console.allowed_origins = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] console.ssl_ciphers = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] console.ssl_minimum_version = default {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] consoleauth.token_ttl = 600 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cyborg.cafile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cyborg.certfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG 
oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cyborg.collect_timing = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cyborg.connect_retries = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cyborg.connect_retry_delay = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cyborg.endpoint_override = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cyborg.insecure = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cyborg.keyfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cyborg.max_version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cyborg.min_version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cyborg.region_name = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cyborg.service_name = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cyborg.service_type = accelerator {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:32 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cyborg.split_loggers = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cyborg.status_code_retries = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cyborg.status_code_retry_delay = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None 
req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cyborg.timeout = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] cyborg.version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] database.backend = sqlalchemy {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] database.connection = **** {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] database.connection_debug = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] database.connection_parameters = {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] database.connection_recycle_time = 3600 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] database.connection_trace = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] database.db_inc_retry_interval = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] database.db_max_retries = 20 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] database.db_max_retry_interval = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] database.db_retry_interval = 1 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] database.max_overflow = 50 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service 
[None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] database.max_pool_size = 5 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] database.max_retries = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] database.mysql_enable_ndb = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] database.mysql_wsrep_sync_wait = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] database.pool_timeout = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] database.retry_interval = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] database.slave_connection = **** {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] database.sqlite_synchronous = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api_database.backend = sqlalchemy {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api_database.connection = **** {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api_database.connection_debug = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api_database.connection_parameters = {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api_database.connection_recycle_time = 3600 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: 
DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api_database.connection_trace = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api_database.db_inc_retry_interval = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api_database.db_max_retries = 20 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api_database.db_max_retry_interval = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api_database.db_retry_interval = 1 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api_database.max_overflow = 50 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api_database.max_pool_size = 5 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api_database.max_retries = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api_database.mysql_enable_ndb = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api_database.pool_timeout = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api_database.retry_interval = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api_database.slave_connection = **** {{(pid=71972) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] api_database.sqlite_synchronous = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] devices.enabled_mdev_types = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ephemeral_storage_encryption.enabled = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.api_servers = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.cafile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.certfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.collect_timing = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.connect_retries = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.connect_retry_delay = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.debug = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.default_trusted_certificate_ids = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] 
glance.enable_certificate_validation = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.enable_rbd_download = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.endpoint_override = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.insecure = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.keyfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.max_version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.min_version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.num_retries = 3 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.rbd_ceph_conf = {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.rbd_connect_timeout = 5 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.rbd_pool = {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.rbd_user = {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.region_name = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.service_name = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.service_type = image {{(pid=71972) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.split_loggers = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.status_code_retries = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.status_code_retry_delay = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.timeout = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.verify_glance_signatures = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] glance.version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] guestfs.debug = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] hyperv.config_drive_cdrom = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] hyperv.config_drive_inject_password = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] hyperv.enable_instance_metrics_collection = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] hyperv.enable_remotefx = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] 
hyperv.instances_path_share = {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] hyperv.iscsi_initiator_list = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] hyperv.limit_cpu_features = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] hyperv.power_state_check_timeframe = 60 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] hyperv.use_multipath_io = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] hyperv.volume_attach_retry_count = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] hyperv.vswitch_name = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] mks.enabled = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service 
[None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] image_cache.manager_interval = 2400 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] image_cache.precache_concurrency = 1 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] image_cache.remove_unused_base_images = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] image_cache.subdirectory_name = _base {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.api_max_retries = 60 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.api_retry_interval = 2 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.auth_section = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.auth_type = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.cafile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.certfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.collect_timing = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 
17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.connect_retries = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.connect_retry_delay = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.endpoint_override = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.insecure = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.keyfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.max_version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.min_version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.partition_key = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.peer_list = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.region_name = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.serial_console_state_timeout = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.service_name = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.service_type = baremetal {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.split_loggers = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG 
oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.status_code_retries = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.status_code_retry_delay = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.timeout = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ironic.version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] key_manager.fixed_key = **** {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican.barbican_api_version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican.barbican_endpoint = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican.barbican_endpoint_type = public {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican.barbican_region_name = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican.cafile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican.certfile = None {{(pid=71972) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican.collect_timing = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican.insecure = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican.keyfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican.number_of_retries = 60 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican.retry_delay = 1 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican.send_service_user_token = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican.split_loggers = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican.timeout = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican.verify_ssl = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican.verify_ssl_path = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican_service_user.auth_section = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican_service_user.auth_type = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican_service_user.cafile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican_service_user.certfile = None {{(pid=71972) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican_service_user.collect_timing = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican_service_user.insecure = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican_service_user.keyfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican_service_user.split_loggers = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] barbican_service_user.timeout = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vault.approle_role_id = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vault.approle_secret_id = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vault.cafile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vault.certfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vault.collect_timing = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vault.insecure = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vault.keyfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vault.kv_mountpoint = secret {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vault.kv_version = 2 {{(pid=71972) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vault.namespace = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vault.root_token_id = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vault.split_loggers = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vault.ssl_ca_crt_file = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vault.timeout = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vault.use_ssl = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] keystone.cafile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] keystone.certfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] keystone.collect_timing = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] keystone.connect_retries = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] keystone.connect_retry_delay = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] keystone.endpoint_override = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] keystone.insecure = False {{(pid=71972) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] keystone.keyfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] keystone.max_version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] keystone.min_version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] keystone.region_name = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] keystone.service_name = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] keystone.service_type = identity {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] keystone.split_loggers = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] keystone.status_code_retries = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] keystone.status_code_retry_delay = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] keystone.timeout = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] keystone.version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.connection_uri = {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.cpu_mode = custom {{(pid=71972) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.cpu_model_extra_flags = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: WARNING oslo_config.cfg [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] Deprecated: Option "cpu_model" from group "libvirt" is deprecated. Use option "cpu_models" from group "libvirt". Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.cpu_models = ['Nehalem'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.cpu_power_governor_high = performance {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.cpu_power_governor_low = powersave {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.cpu_power_management = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.device_detach_attempts = 8 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.device_detach_timeout = 20 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.disk_cachemodes = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.disk_prefix = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.enabled_perf_events = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.file_backed_memory = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.gid_maps = [] 
{{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.hw_disk_discard = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.hw_machine_type = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.images_rbd_ceph_conf = {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.images_rbd_glance_store_name = {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.images_rbd_pool = rbd {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.images_type = default {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.images_volume_group = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.inject_key = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.inject_partition = -2 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.inject_password = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.iscsi_iface = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] 
libvirt.iser_use_multipath = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.live_migration_bandwidth = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.live_migration_downtime = 500 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.live_migration_inbound_addr = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.live_migration_permit_post_copy = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.live_migration_scheme = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.live_migration_timeout_action = abort {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.live_migration_tunnelled = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: WARNING oslo_config.cfg [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Apr 17 22:04:33 user nova-compute[71972]: live_migration_uri is deprecated for removal in favor of two other options that Apr 17 22:04:33 user nova-compute[71972]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Apr 17 22:04:33 user nova-compute[71972]: and 
``live_migration_inbound_addr`` respectively. Apr 17 22:04:33 user nova-compute[71972]: ). Its value may be silently ignored in the future. Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.live_migration_uri = qemu+ssh://stack@%s/system {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.live_migration_with_native_tls = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.max_queues = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.nfs_mount_options = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.nfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.num_iser_scan_tries = 5 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.num_memory_encrypted_guests = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.num_pcie_ports = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.num_volume_scan_tries = 5 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.pmem_namespaces = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user 
nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.quobyte_client_cfg = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.rbd_connect_timeout = 5 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.rbd_secret_uuid = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.rbd_user = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.remote_filesystem_transport = ssh {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.rescue_image_id = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.rescue_kernel_id = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.rescue_ramdisk_id = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.rx_queue_size = None {{(pid=71972) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.smbfs_mount_options = {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.snapshot_compression = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.snapshot_image_format = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.sparse_logical_volumes = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.swtpm_enabled = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.swtpm_group = tss {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.swtpm_user = tss {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.sysinfo_serial = unique {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.tx_queue_size = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.uid_maps = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.use_virtio_for_bridges = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] 
libvirt.virt_type = kvm {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.volume_clear = zero {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.volume_clear_size = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.volume_use_multipath = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.vzstorage_cache_path = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.vzstorage_mount_group = qemu {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.vzstorage_mount_opts = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/nova/mnt {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.vzstorage_mount_user = stack {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.auth_section = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.auth_type = password {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user 
nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.cafile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.certfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.collect_timing = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.connect_retries = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.connect_retry_delay = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.default_floating_pool = public {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.endpoint_override = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.extension_sync_interval = 600 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.http_retries = 3 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.insecure = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.keyfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.max_version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.min_version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: 
DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.ovs_bridge = br-int {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.physnets = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.region_name = RegionOne {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.service_metadata_proxy = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.service_name = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.service_type = network {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.split_loggers = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.status_code_retries = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.status_code_retry_delay = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.timeout = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] neutron.version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] notifications.bdms_in_notifications = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] notifications.default_level = INFO {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user 
nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] notifications.notification_format = unversioned {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] notifications.notify_on_state_change = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] pci.alias = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] pci.device_spec = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] pci.report_in_placement = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.auth_section = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.auth_type = password {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.auth_url = http://10.0.0.210/identity {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.cafile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.certfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.collect_timing = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.connect_retries = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.connect_retry_delay = None {{(pid=71972) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.default_domain_id = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.default_domain_name = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.domain_id = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.domain_name = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.endpoint_override = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.insecure = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.keyfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.max_version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.min_version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.password = **** {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.project_domain_id = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.project_domain_name = Default {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.project_id = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.project_name = service {{(pid=71972) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.region_name = RegionOne {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.service_name = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.service_type = placement {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.split_loggers = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.status_code_retries = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.status_code_retry_delay = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.system_scope = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.timeout = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.trust_id = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.user_domain_id = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.user_domain_name = Default {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.user_id = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.username = placement {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.valid_interfaces = ['internal', 'public'] 
{{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] placement.version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] quota.cores = 20 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] quota.count_usage_from_placement = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] quota.injected_file_content_bytes = 10240 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] quota.injected_file_path_length = 255 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] quota.injected_files = 5 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] quota.instances = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] quota.key_pairs = 100 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] quota.metadata_items = 128 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] quota.ram = 51200 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] quota.recheck_quota = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] quota.server_group_members = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] quota.server_groups = 10 {{(pid=71972) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] rdp.enabled = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] scheduler.image_metadata_prefilter = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] scheduler.max_attempts = 3 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] scheduler.max_placement_results = 1000 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] scheduler.query_placement_for_availability_zone = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] scheduler.query_placement_for_image_type_support = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] scheduler.workers = 3 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 
22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.enabled_filters = ['AvailabilityZoneFilter', 'ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.host_subset_size = 1 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.isolated_hosts = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: 
DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.isolated_images = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.pci_in_placement = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.track_instance_changes = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] metrics.required = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service 
[None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] metrics.weight_multiplier = 1.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] metrics.weight_setting = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] serial_console.enabled = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] serial_console.port_range = 10000:20000 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] serial_console.serialproxy_port = 6083 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] service_user.auth_section = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] service_user.auth_type = password {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] service_user.cafile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] service_user.certfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] service_user.collect_timing = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 
22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] service_user.insecure = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] service_user.keyfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] service_user.send_service_user_token = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] service_user.split_loggers = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] service_user.timeout = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] spice.agent_enabled = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] spice.enabled = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] spice.html5proxy_base_url = http://10.0.0.210:6081/spice_auto.html {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] spice.html5proxy_port = 6082 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] spice.image_compression = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] spice.jpeg_compression = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] spice.playback_compression = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] spice.server_listen = 127.0.0.1 {{(pid=71972) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] spice.streaming_mode = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] spice.zlib_compression = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] upgrade_levels.baseapi = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] upgrade_levels.cert = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] upgrade_levels.compute = auto {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] upgrade_levels.conductor = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] upgrade_levels.scheduler = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vendordata_dynamic_auth.auth_section = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vendordata_dynamic_auth.auth_type = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vendordata_dynamic_auth.cafile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vendordata_dynamic_auth.certfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] 
vendordata_dynamic_auth.insecure = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vendordata_dynamic_auth.keyfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vendordata_dynamic_auth.timeout = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.api_retry_count = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.ca_file = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.cache_prefix = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.cluster_name = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.connection_pool_size = 10 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.console_delay_seconds = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.datastore_regex = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.host_ip = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.host_password = **** {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.host_port = 443 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] 
vmware.host_username = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.insecure = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.integration_bridge = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.maximum_objects = 100 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.pbm_default_policy = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.pbm_enabled = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.pbm_wsdl_location = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.serial_port_proxy_uri = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.serial_port_service_uri = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.task_poll_interval = 0.5 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.use_linked_clone = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.vnc_keymap = en-us {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vmware.vnc_port = 5900 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] 
vmware.vnc_port_total = 10000 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vnc.auth_schemes = ['none'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vnc.enabled = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vnc.novncproxy_base_url = http://10.0.0.210:6080/vnc_lite.html {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vnc.novncproxy_port = 6080 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vnc.server_listen = 0.0.0.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vnc.server_proxyclient_address = 10.0.0.210 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vnc.vencrypt_ca_certs = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vnc.vencrypt_client_cert = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vnc.vencrypt_client_key = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] workarounds.disable_group_policy_check_upcall = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG 
oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] workarounds.disable_rootwrap = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] workarounds.enable_numa_live_migration = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] workarounds.libvirt_disable_apic = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None 
req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] wsgi.client_socket_timeout = 900 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] wsgi.default_pool_size = 1000 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] wsgi.keep_alive = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] wsgi.max_header_line = 16384 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] wsgi.secure_proxy_ssl_header = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] wsgi.ssl_ca_file = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] wsgi.ssl_cert_file = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] wsgi.ssl_key_file = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] wsgi.tcp_keepidle = 600 {{(pid=71972) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] zvm.ca_file = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] zvm.cloud_connector_url = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] zvm.image_tmp_path = /opt/stack/data/nova/images {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] zvm.reachable_timeout = 300 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_policy.enforce_new_defaults = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_policy.enforce_scope = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_policy.policy_default_rule = default {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_policy.policy_file = policy.yaml {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
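For context, every per-option DEBUG record above is produced by oslo.config's log_opt_values() helper, which Nova calls once at service startup to dump the fully resolved configuration. A minimal standalone sketch of that mechanism follows; the option names and the "example" group are illustrative only, not Nova's actual option definitions.

    import logging
    from oslo_config import cfg

    LOG = logging.getLogger(__name__)
    CONF = cfg.CONF

    # Hypothetical options for illustration; Nova registers its own, much larger set.
    CONF.register_opts(
        [cfg.BoolOpt('enabled', default=True),
         cfg.StrOpt('policy_file', default='policy.yaml')],
        group='example')

    if __name__ == '__main__':
        logging.basicConfig(level=logging.DEBUG)
        CONF([], project='example')              # parse (empty) command line and config files
        CONF.log_opt_values(LOG, logging.DEBUG)  # emits one DEBUG record per option, as seen above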
Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] profiler.connection_string = messaging:// {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] profiler.enabled = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] profiler.es_doc_type = notification {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] profiler.es_scroll_size = 10000 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] profiler.es_scroll_time = 2m {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] profiler.filter_error_trace = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] profiler.hmac_keys = SECRET_KEY {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] profiler.sentinel_service_name = mymaster {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] profiler.socket_timeout = 0.1 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] 
profiler.trace_sqlalchemy = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] remote_debug.host = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] remote_debug.port = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=71972) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.rabbit_quroum_max_memory_bytes = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.rabbit_quroum_max_memory_length = 0 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 
{{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.ssl = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_rabbit.ssl_version = {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_notifications.retry = -1 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_messaging_notifications.transport_url = **** {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.auth_section = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.auth_type = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user 
nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.cafile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.certfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.collect_timing = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.connect_retries = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.connect_retry_delay = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.endpoint_id = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.endpoint_override = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.insecure = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.keyfile = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.max_version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.min_version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.region_name = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.service_name = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.service_type = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user 
nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.split_loggers = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.status_code_retries = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.status_code_retry_delay = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.timeout = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.valid_interfaces = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_limit.version = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_reports.file_event_handler = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] oslo_reports.log_dir = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] 
vif_plug_linux_bridge_privileged.thread_pool_size = 12 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vif_plug_ovs_privileged.group = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vif_plug_ovs_privileged.thread_pool_size = 12 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] vif_plug_ovs_privileged.user = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] os_vif_linux_bridge.flat_interface = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=71972) 
log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] os_vif_ovs.isolate_vif = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] os_vif_ovs.ovsdb_interface = native {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] os_vif_ovs.per_port_bridge = False {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] os_brick.lock_path = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] privsep_osbrick.capabilities = [21] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] privsep_osbrick.group = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] privsep_osbrick.helper_command = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None 
req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] privsep_osbrick.thread_pool_size = 12 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] privsep_osbrick.user = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] nova_sys_admin.group = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] nova_sys_admin.helper_command = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] nova_sys_admin.thread_pool_size = 12 {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] nova_sys_admin.user = None {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG oslo_service.service [None req-1315d11c-e9fa-4bfe-b3a9-e4fde2aba9f0 None None] ******************************************************************************** {{(pid=71972) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} Apr 17 22:04:33 user nova-compute[71972]: INFO nova.service [-] Starting compute node (version 0.0.0) Apr 17 22:04:33 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Starting native event thread {{(pid=71972) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:492}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Starting green dispatch thread {{(pid=71972) _init_events /opt/stack/nova/nova/virt/libvirt/host.py:498}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Starting connection event dispatch thread {{(pid=71972) initialize /opt/stack/nova/nova/virt/libvirt/host.py:620}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Connecting to libvirt: qemu:///system {{(pid=71972) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:503}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Registering for lifecycle events {{(pid=71972) 
_get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:509}} Apr 17 22:04:33 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Registering for connection events: {{(pid=71972) _get_new_connection /opt/stack/nova/nova/virt/libvirt/host.py:530}} Apr 17 22:04:33 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Connection event '1' reason 'None' Apr 17 22:04:33 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Cannot update service status on host "user" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host user could not be found. Apr 17 22:04:33 user nova-compute[71972]: DEBUG nova.virt.libvirt.volume.mount [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Initialising _HostMountState generation 0 {{(pid=71972) host_up /opt/stack/nova/nova/virt/libvirt/volume/mount.py:130}} Apr 17 22:04:40 user nova-compute[71972]: INFO nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Libvirt host capabilities
[host capabilities XML elided; recoverable details: host UUID e20c3142-5af9-7467-ecd8-70b2e4a210d6; arch x86_64; CPU model IvyBridge-IBRS, vendor Intel; migration transports tcp and rdma; two NUMA cells of roughly 8 GiB each; security models apparmor and dac (baselabel +64055:+108); hvm guest support with machine-type lists for /usr/bin/qemu-system-{alpha, arm, aarch64, cris, i386, m68k, microblaze, microblazeel, mips, mipsel, mips64, mips64el, ppc, ppc64, ppc64le, riscv32, riscv64, s390x, sh4, sh4eb, sparc, sparc64, x86_64, xtensa, xtensaeb}]
Apr 17 22:04:40 user
nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for alpha via machine types: {None} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch alpha / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-alpha' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for armv6l via machine types: {None, 'virt'} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch armv6l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for armv7l via machine types: {'virt'} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch armv7l / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-arm' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for aarch64 via machine types: {'virt'} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch aarch64 / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-aarch64' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for cris via machine types: {None} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG 
nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch cris / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-cris' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for i686 via machine types: {'ubuntu', 'ubuntu-q35', 'pc', 'q35'} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu: [domain capabilities XML elided: emulator /usr/bin/qemu-system-i386, domain kvm, machine pc-i440fx-jammy, arch i686; firmware loaders /usr/share/OVMF/OVMF_CODE.fd, /usr/share/OVMF/OVMF_CODE.secboot.fd, /usr/share/AAVMF/AAVMF_CODE.fd, /usr/share/AAVMF/AAVMF32_CODE.fd, /usr/share/OVMF/OVMF_CODE.ms.fd (types rom, pflash); host CPU model IvyBridge-IBRS (Intel); custom CPU models 486, qemu32/qemu64, kvm32/kvm64, pentium/pentium2/pentium3, coreduo, core2duo, n270, athlon, phenom, Conroe, Penryn, Nehalem, Westmere, SandyBridge, IvyBridge, Haswell, Broadwell, Skylake-Client, Skylake-Server, Cascadelake-Server, Cooperlake, Icelake-Client, Icelake-Server, Snowridge, Opteron_G1-G5, EPYC, EPYC-IBPB, EPYC-Rome, EPYC-Milan, Dhyana (with -IBRS/-noTSX variants); memory backing file, anonymous, memfd; disk device types disk, cdrom, floppy, lun on buses ide, fdc, scsi, virtio, usb, sata; virtio, virtio-transitional, virtio-non-transitional models; graphics sdl, vnc, spice, egl-headless; hostdev subsystem with policies default, mandatory, requisite, optional for usb, pci, scsi; rng backends random, egd, builtin; filesystem drivers path, handle, virtiofs; TPM models tpm-tis, tpm-crb with backends passthrough, emulator] {{(pid=71972) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=ubuntu-q35: [domain capabilities XML elided: as for machine_type=ubuntu above, but machine pc-q35-jammy and no ide disk bus] {{(pid=71972) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc: [domain capabilities XML elided: as for machine_type=ubuntu above, but machine pc-i440fx-6.2] {{(pid=71972) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35: [domain capabilities XML elided: as for machine_type=ubuntu above, but machine pc-q35-6.2 and no ide disk bus] {{(pid=71972) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for m68k via machine types: {None, 'virt'} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch m68k / virt_type kvm / machine_type virt: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-m68k' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for microblaze via machine types: {None} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch microblaze / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-microblaze' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for microblazeel via machine types: {None} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch microblazeel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-microblazeel' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for mips via machine types: {None} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain
capabilities for arch mips / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for mipsel via machine types: {None} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch mipsel / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mipsel' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for mips64 via machine types: {None} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch mips64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips64' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for mips64el via machine types: {None} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch mips64el / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-mips64el' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for ppc via machine types: {None} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch ppc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for ppc64 via machine types: {'powernv', 'pseries', None} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch 
ppc64 / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch ppc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for ppc64le via machine types: {'powernv', 'pseries'} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type powernv: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch ppc64le / virt_type kvm / machine_type pseries: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-ppc64le' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for riscv32 via machine types: {None} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch riscv32 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-riscv32' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for riscv64 via machine types: {None} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch riscv64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by 
'/usr/bin/qemu-system-riscv64' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for s390x via machine types: {'s390-ccw-virtio'} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch s390x / virt_type kvm / machine_type s390-ccw-virtio: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-s390x' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for sh4 via machine types: {None} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch sh4 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sh4' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for sh4eb via machine types: {None} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch sh4eb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sh4eb' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for sparc via machine types: {None} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch sparc / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sparc' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for sparc64 via machine types: {None} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch sparc64 / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-sparc64' 
on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for x86_64 via machine types: {'ubuntu', 'ubuntu-q35', 'pc', 'q35'} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu: [multi-line <domainCapabilities> XML; the markup was stripped when this log was captured. Recoverable values: emulator /usr/bin/qemu-system-x86_64, domain kvm, machine pc-i440fx-jammy, arch x86_64, efi firmware with loader /usr/share/OVMF/OVMF_CODE_4M.fd (loader types rom/pflash), host CPU model IvyBridge-IBRS (Intel), custom CPU models qemu64, qemu32, phenom, pentium3/pentium2/pentium, n270, kvm64, kvm32, coreduo, core2duo, athlon, plus the Westmere, Snowridge, Skylake, SandyBridge, Penryn, Opteron_G1-G5, Nehalem, IvyBridge, Icelake, Haswell, EPYC, Dhyana, Cooperlake, Conroe, Cascadelake and Broadwell families, down to 486; memory backing file/anonymous/memfd; disk devices disk/cdrom/floppy/lun on buses ide/fdc/scsi/virtio/usb/sata with virtio models virtio/virtio-transitional/virtio-non-transitional; graphics sdl/vnc/spice/egl-headless; hostdev subsystem usb/pci/scsi (default/mandatory/requisite/optional); rng backends random/egd/builtin; filesystem drivers path/handle/virtiofs; tpm models tpm-tis/tpm-crb with backends passthrough/emulator] {{(pid=71972) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=ubuntu-q35: [domainCapabilities XML elided as above; differs in machine pc-q35-jammy, loaders /usr/share/OVMF/OVMF_CODE_4M.ms.fd, /usr/share/OVMF/OVMF_CODE_4M.secboot.fd and /usr/share/OVMF/OVMF_CODE_4M.fd (secure-boot capable), and no ide disk bus] {{(pid=71972) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: [domainCapabilities XML elided as above; machine pc-i440fx-6.2, single loader /usr/share/OVMF/OVMF_CODE_4M.fd, ide disk bus present] {{(pid=71972) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35: [domainCapabilities XML elided as above; machine pc-q35-6.2, loaders /usr/share/OVMF/OVMF_CODE_4M.ms.fd, /usr/share/OVMF/OVMF_CODE_4M.secboot.fd and /usr/share/OVMF/OVMF_CODE_4M.fd (secure-boot capable), no ide disk bus] {{(pid=71972) _get_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:1037}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for xtensa via machine types: {None} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch xtensa / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-xtensa' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Getting domain capabilities for xtensaeb via machine types: {None} {{(pid=71972) _get_machine_types /opt/stack/nova/nova/virt/libvirt/host.py:952}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Error from libvirt when retrieving domain capabilities for arch xtensaeb / virt_type kvm / machine_type None: [Error Code 8]: invalid argument: KVM is not supported by '/usr/bin/qemu-system-xtensaeb' on this host {{(pid=71972) _add_to_domain_capabilities /opt/stack/nova/nova/virt/libvirt/host.py:991}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Checking secure boot support for host arch (x86_64) {{(pid=71972) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1750}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Checking secure boot support for host arch (x86_64) {{(pid=71972) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1750}}
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Checking secure boot support for host arch (x86_64) {{(pid=71972) supports_secure_boot /opt/stack/nova/nova/virt/libvirt/host.py:1750}}
Apr 17 22:04:40 user nova-compute[71972]: INFO nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Secure Boot support detected
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] cpu compare xml: [cpu XML elided; model Nehalem] {{(pid=71972) _compare_cpu /opt/stack/nova/nova/virt/libvirt/driver.py:9996}}
Apr 17 22:04:40 user nova-compute[71972]: INFO nova.virt.node [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Generated node identity 40de8fba-171e-4c3d-8cc3-30d210d6a26e
Apr 17 22:04:40 user nova-compute[71972]: INFO nova.virt.node [None
req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Wrote node identity 40de8fba-171e-4c3d-8cc3-30d210d6a26e to /opt/stack/data/nova/compute_id Apr 17 22:04:40 user nova-compute[71972]: WARNING nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Compute nodes ['40de8fba-171e-4c3d-8cc3-30d210d6a26e'] for host user were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. Apr 17 22:04:40 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host Apr 17 22:04:40 user nova-compute[71972]: WARNING nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] No compute node record found for host user. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host user could not be found. Apr 17 22:04:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:04:40 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:04:40 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
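[Annotation] The "Getting domain capabilities" / "Libvirt host hypervisor capabilities" entries above come from the libvirt driver querying libvirtd once per (arch, machine type) pair at start-up; the "KVM is not supported" errors for the non-x86 emulators are expected on an x86_64 host, since only the host architecture can use KVM. A minimal sketch of the same query, assuming the python3-libvirt bindings are installed and a local libvirtd is reachable (the URI, emulator path and machine type are illustrative values taken from the log; this is not Nova's exact code path):

    import libvirt

    # Open the same system connection nova-compute uses for the qemu/kvm driver.
    conn = libvirt.open('qemu:///system')
    try:
        # One probe: emulator binary, arch, machine type, virt type, flags.
        xml = conn.getDomainCapabilities(
            '/usr/bin/qemu-system-x86_64', 'x86_64', 'q35', 'kvm', 0)
        print(xml)  # the <domainCapabilities> document summarized above
    finally:
        conn.close()

The shell equivalent is: virsh domcapabilities --virttype kvm --arch x86_64 --machine q35 --emulatorbin /usr/bin/qemu-system-x86_64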
Apr 17 22:04:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Hypervisor/Node resource view: name=user free_ram=10848MB free_disk=27.113140106201172GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:04:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:04:40 user nova-compute[71972]: WARNING nova.compute.resource_tracker [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] No compute node record for user:40de8fba-171e-4c3d-8cc3-30d210d6a26e: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 40de8fba-171e-4c3d-8cc3-30d210d6a26e could not be found. Apr 17 22:04:41 user nova-compute[71972]: INFO nova.compute.resource_tracker [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Compute node record created for user:user with uuid: 40de8fba-171e-4c3d-8cc3-30d210d6a26e Apr 17 22:04:41 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:04:41 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:04:41 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [req-26b073da-e3c9-42e8-b0ad-0b51f0e2db3f] Created resource provider record via placement API for resource provider with UUID 40de8fba-171e-4c3d-8cc3-30d210d6a26e and name user. 
Apr 17 22:04:41 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] /sys/module/kvm_amd/parameters/sev does not exist {{(pid=71972) _kernel_supports_amd_sev /opt/stack/nova/nova/virt/libvirt/host.py:1766}} Apr 17 22:04:41 user nova-compute[71972]: INFO nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] kernel doesn't support AMD SEV Apr 17 22:04:41 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Updating inventory in ProviderTree for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e with inventory: {'MEMORY_MB': {'total': 16023, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 12, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 40, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 17 22:04:41 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:04:41 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Libvirt baseline CPU Apr 17 22:04:41 user nova-compute[71972]: x86_64 Apr 17 22:04:41 user nova-compute[71972]: Nehalem Apr 17 22:04:41 user nova-compute[71972]: Intel Apr 17 22:04:41 user nova-compute[71972]: Apr 17 22:04:41 user nova-compute[71972]: Apr 17 22:04:41 user nova-compute[71972]: {{(pid=71972) _get_guest_baseline_cpu_features /opt/stack/nova/nova/virt/libvirt/driver.py:12486}} Apr 17 22:04:41 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Updated inventory for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 16023, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 12, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 40, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} Apr 17 22:04:41 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Updating resource provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e generation from 0 to 1 during operation: update_inventory {{(pid=71972) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Apr 17 22:04:41 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Updating inventory in ProviderTree for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e with inventory: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 17 22:04:41 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Updating resource provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e generation from 1 to 2 during operation: update_traits {{(pid=71972) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} Apr 17 22:04:41 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:04:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.658s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:04:41 user nova-compute[71972]: DEBUG nova.service [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Creating RPC server for service compute {{(pid=71972) start /opt/stack/nova/nova/service.py:182}} Apr 17 22:04:41 user nova-compute[71972]: DEBUG nova.service [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Join ServiceGroup membership for this service compute {{(pid=71972) start /opt/stack/nova/nova/service.py:199}} Apr 17 22:04:41 user nova-compute[71972]: DEBUG nova.servicegroup.drivers.db [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] DB_Driver: join new ServiceGroup member user to the compute group, service = {{(pid=71972) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} Apr 17 22:05:13 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._sync_power_states {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:05:13 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:05:32 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:05:32 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:05:32 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:05:32 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Rebuilding the list of instances to heal {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 22:05:32 user nova-compute[71972]: 
DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Didn't find any instances for network info cache update. {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 17 22:05:32 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:05:32 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:05:32 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:05:32 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:05:32 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:05:32 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:05:32 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:05:32 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:05:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:05:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:05:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:05:32 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:05:32 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:05:32 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 22:05:32 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=10182MB free_disk=27.026966094970703GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:05:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:05:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:05:33 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:05:33 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:05:33 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:05:33 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:05:33 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:05:33 user nova-compute[71972]: DEBUG 
oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.409s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:06:33 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:06:33 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:06:33 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:06:33 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:06:33 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:06:33 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:06:33 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:06:33 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Rebuilding the list of instances to heal {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 22:06:33 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Didn't find any instances for network info cache update. 
{{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 17 22:06:33 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:06:34 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:06:34 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:06:34 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:06:34 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:06:34 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:06:34 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:06:34 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:06:34 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:06:34 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:06:34 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
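Each audit cycle logs a "Hypervisor/Node resource view" record, like the one that follows, whose free_ram/free_disk/free_vcpus fields and pci_devices JSON array give the raw host view before reservations and allocation ratios are applied. Below is a small sketch for pulling those fields out of a journal line of this shape; the regex is tailored to the exact formatting seen in this log and is an assumption that may need adjusting for other Nova releases.

    import json
    import re

    # Matches the "Hypervisor/Node resource view" payload as formatted in this log.
    PATTERN = re.compile(
        r"free_ram=(?P<ram>\d+)MB\s+"
        r"free_disk=(?P<disk>[\d.]+)GB\s+"
        r"free_vcpus=(?P<vcpus>\d+)\s+"
        r"pci_devices=(?P<pci>\[.*?\])\s*\{\{"
    )

    def parse_resource_view(line):
        # Hypothetical helper: extract host free resources and PCI devices from one record.
        m = PATTERN.search(line)
        if m is None:
            return None
        return {
            "free_ram_mb": int(m.group("ram")),
            "free_disk_gb": float(m.group("disk")),
            "free_vcpus": int(m.group("vcpus")),
            "pci_devices": json.loads(m.group("pci")),
        }

Applied to the record below, this yields free_ram_mb=10240, a free_disk_gb of roughly 27.08, free_vcpus=12, and the full list of type-PCI devices the libvirt driver reports.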
Apr 17 22:06:35 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=10240MB free_disk=27.080692291259766GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:06:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:06:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:06:35 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:06:35 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:06:35 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:06:35 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:06:35 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:06:35 user nova-compute[71972]: DEBUG 
oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.149s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:07:33 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:07:33 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:07:33 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:07:33 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:07:34 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:07:34 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:07:34 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:07:34 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Rebuilding the list of instances to heal {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 22:07:34 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Didn't find any instances for network info cache update. 
{{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 17 22:07:34 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:07:34 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:07:36 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:07:36 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:07:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:07:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:07:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:07:36 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:07:36 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:07:36 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 22:07:36 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=10206MB free_disk=26.850261688232422GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:07:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:07:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:07:37 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:07:37 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:07:37 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:07:37 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:07:37 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:07:37 user nova-compute[71972]: DEBUG 
oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.183s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:33 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:08:33 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:08:33 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:08:33 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:08:35 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:08:35 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:08:36 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:08:36 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:08:36 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:08:36 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Rebuilding the list of instances to heal {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 22:08:36 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Didn't find any instances for network info cache update. 
{{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 17 22:08:36 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:08:36 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:08:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:36 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:08:37 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:08:37 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 22:08:37 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=9506MB free_disk=26.879833221435547GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:08:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:37 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:08:37 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:08:37 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:08:37 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:08:37 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:08:37 user nova-compute[71972]: DEBUG 
oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.134s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:38 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:08:52 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Acquiring lock "7bb0580b-727f-4168-9d56-56dcb4fa404e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:52 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "7bb0580b-727f-4168-9d56-56dcb4fa404e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:52 user nova-compute[71972]: DEBUG nova.compute.manager [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Starting instance... {{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:08:52 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:52 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:52 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:08:52 user nova-compute[71972]: INFO nova.compute.claims [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Claim successful on node user Apr 17 22:08:53 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:08:53 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:08:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.335s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:08:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Allocating IP information in the background. {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:08:53 user nova-compute[71972]: DEBUG nova.network.neutron [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:08:53 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 17 22:08:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Start building block device mappings for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:08:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Start spawning the instance on the hypervisor. {{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:08:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:08:53 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Creating image(s) Apr 17 22:08:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Acquiring lock "/opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "/opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "/opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.003s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:53 user nova-compute[71972]: DEBUG 
oslo_concurrency.lockutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:54 user nova-compute[71972]: DEBUG nova.policy [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '51c0b269c97241d9ad122b23af3ca7ea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f21699c3400842d3a28e71b288a4aaff', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:08:54 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720.part --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "3681b009-1a99-4eb8-b189-3fe0647f5d1d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "3681b009-1a99-4eb8-b189-3fe0647f5d1d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:54 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Starting instance... 
{{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:08:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Acquiring lock "99cb7131-abb8-41d6-bddd-c3bc943b7678" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Lock "99cb7131-abb8-41d6-bddd-c3bc943b7678" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:54 user nova-compute[71972]: DEBUG nova.compute.manager [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Starting instance... {{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:08:54 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720.part --force-share --output=json" returned: 0 in 0.176s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Acquiring lock "c4fc7798-567a-4002-b056-6c4f02d0e955" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Lock "c4fc7798-567a-4002-b056-6c4f02d0e955" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:54 user nova-compute[71972]: DEBUG nova.virt.images [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] 80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f was qcow2, converting to raw {{(pid=71972) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 17 22:08:54 user nova-compute[71972]: DEBUG nova.privsep.utils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Path '/opt/stack/data/nova/instances' supports 
direct I/O {{(pid=71972) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 17 22:08:54 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720.part /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720.converted {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:54 user nova-compute[71972]: DEBUG nova.compute.manager [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Starting instance... {{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:08:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:54 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:08:54 user nova-compute[71972]: INFO nova.compute.claims [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Claim successful on node user Apr 17 22:08:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720.part /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720.converted" returned: 0 in 0.212s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720.converted --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720.converted --force-share --output=json" returned: 0 in 0.139s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.411s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:55 user nova-compute[71972]: INFO oslo.privsep.daemon [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 
tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpm5j_3io9/privsep.sock'] Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:08:55 user sudo[80750]: stack : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context nova.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpm5j_3io9/privsep.sock Apr 17 22:08:55 user sudo[80750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.378s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.378s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:08:55 user nova-compute[71972]: INFO nova.compute.claims [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Claim successful on node user Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Allocating IP information in the background. {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.network.neutron [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:08:55 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Start building block device mappings for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Start spawning the instance on the hypervisor. 
{{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:08:55 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Creating image(s) Apr 17 22:08:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "/opt/stack/data/nova/instances/3681b009-1a99-4eb8-b189-3fe0647f5d1d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "/opt/stack/data/nova/instances/3681b009-1a99-4eb8-b189-3fe0647f5d1d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "/opt/stack/data/nova/instances/3681b009-1a99-4eb8-b189-3fe0647f5d1d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.policy [None 
req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '52b3e35c03b54ae4b5dabfb1325886a9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e52724ed9bc54905bd5eddd8504e4c77', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.395s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.643s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:08:55 user nova-compute[71972]: INFO nova.compute.claims [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Claim successful on node user Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Allocating IP information in the background. 
{{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.network.neutron [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:08:55 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Start building block device mappings for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Start spawning the instance on the hypervisor. 
{{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:08:55 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Creating image(s) Apr 17 22:08:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Acquiring lock "/opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Lock "/opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Lock "/opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.329s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:55 user nova-compute[71972]: DEBUG nova.compute.manager [None 
req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:08:56 user nova-compute[71972]: DEBUG nova.compute.manager [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Allocating IP information in the background. {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:08:56 user nova-compute[71972]: DEBUG nova.network.neutron [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:08:56 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 22:08:56 user nova-compute[71972]: DEBUG nova.policy [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9be63d1d20854fa28375599715a5ba74', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b99089f4e3074ee7a5c1ada03ceb8984', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:08:56 user nova-compute[71972]: DEBUG nova.compute.manager [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Start building block device mappings for instance. 
{{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:08:56 user nova-compute[71972]: DEBUG nova.policy [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9ab44d4339554bfba6ac66bebad74413', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '41c0b4d04b1b425db64e6ff5066f1dbe', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:08:56 user nova-compute[71972]: DEBUG nova.compute.manager [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Start spawning the instance on the hypervisor. {{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:08:56 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:08:56 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Creating image(s) Apr 17 22:08:56 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Acquiring lock "/opt/stack/data/nova/instances/c4fc7798-567a-4002-b056-6c4f02d0e955/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:56 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Lock "/opt/stack/data/nova/instances/c4fc7798-567a-4002-b056-6c4f02d0e955/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:56 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Lock "/opt/stack/data/nova/instances/c4fc7798-567a-4002-b056-6c4f02d0e955/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:56 user nova-compute[71972]: DEBUG nova.network.neutron [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 
tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Successfully created port: 06548a29-a501-4b57-97f1-8afe930c8463 {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:08:56 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquiring lock "9634492c-168d-4b49-941a-b89703571b73" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:56 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "9634492c-168d-4b49-941a-b89703571b73" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:56 user nova-compute[71972]: DEBUG nova.compute.manager [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Starting instance... {{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:08:56 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:56 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:56 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:08:56 user nova-compute[71972]: INFO nova.compute.claims [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Claim successful on node user Apr 17 22:08:56 user sudo[80750]: pam_unix(sudo:session): session closed for user root Apr 17 22:08:56 user nova-compute[71972]: INFO oslo.privsep.daemon [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Spawned new privsep daemon via rootwrap Apr 17 22:08:56 user nova-compute[71972]: INFO oslo.privsep.daemon [-] privsep daemon starting Apr 17 22:08:56 user nova-compute[71972]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Apr 17 22:08:56 user nova-compute[71972]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none Apr 17 22:08:56 user nova-compute[71972]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 80753 Apr 17 22:08:56 user nova-compute[71972]: WARNING oslo_privsep.priv_context [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] privsep daemon already running Apr 17 22:08:56 user nova-compute[71972]: WARNING oslo_privsep.priv_context [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] privsep daemon already running Apr 17 22:08:56 user nova-compute[71972]: WARNING oslo_privsep.priv_context [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] privsep daemon already running Apr 17 22:08:57 user nova-compute[71972]: DEBUG nova.network.neutron [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Successfully created port: 4de27111-4afc-4732-88b3-2485c4f254e8 {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share 
--output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.144s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.003s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Inventory has not 
changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.219s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.159s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.584s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG nova.compute.manager [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 
tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.223s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.156s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG nova.compute.manager [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Allocating IP information in the background. 
{{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG nova.network.neutron [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk 1073741824" returned: 0 in 0.075s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:57 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.247s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.175s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG 
nova.compute.manager [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Start building block device mappings for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.164s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Checking if we can resize image /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk. size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.157s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/3681b009-1a99-4eb8-b189-3fe0647f5d1d/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG nova.compute.manager [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Start spawning the instance on the hypervisor. 
{{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:08:57 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Creating image(s) Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquiring lock "/opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "/opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "/opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.008s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/3681b009-1a99-4eb8-b189-3fe0647f5d1d/disk 1073741824" returned: 0 in 0.089s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:57 user 
nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.254s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.445s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json" returned: 0 in 0.190s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Cannot resize image /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk to a smaller size. 
{{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG nova.objects.instance [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Lazy-loading 'migration_context' on Instance uuid 99cb7131-abb8-41d6-bddd-c3bc943b7678 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Ensure instance console log exists: /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.137s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.141s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Checking if we can resize image /opt/stack/data/nova/instances/3681b009-1a99-4eb8-b189-3fe0647f5d1d/disk. size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3681b009-1a99-4eb8-b189-3fe0647f5d1d/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG nova.policy [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b3e3003057e7456c933b762412442a3e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a863c30ce3844f0ba754b048c2833fa3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.160s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:57 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/c4fc7798-567a-4002-b056-6c4f02d0e955/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} 
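[annotation] The create_qcow2_image activity recorded above reduces to three steps per instance: take an external lock named after the cached base image, probe that base with a prlimit-capped "qemu-img info ... --force-share --output=json", and then create the per-instance disk as a qcow2 overlay via "qemu-img create -f qcow2 -o backing_file=...,backing_fmt=raw". The Python sketch below only mirrors those recorded commands using oslo.concurrency; the helper name create_qcow2_overlay, its lock_path argument, and the shrink check are illustrative assumptions and this is not Nova's actual nova.virt.libvirt.imagebackend code.

    # Hypothetical sketch of the qcow2 overlay creation seen in the log entries
    # above; it reproduces the recorded commands, not Nova's implementation.
    import json
    import os

    from oslo_concurrency import lockutils, processutils

    # Same caps as the "--as=1073741824 --cpu=30" prlimit wrapper in the log.
    QEMU_IMG_LIMITS = processutils.ProcessLimits(address_space=1 * 1024 ** 3,
                                                 cpu_time=30)

    def create_qcow2_overlay(base, disk_path, size_bytes, lock_path):
        """Create disk_path as a qcow2 overlay backed by the raw base image."""

        # External lock named after the base image, like the
        # "fde46b8e739fd6213d1525690f3ccf27384ee720" lock in the log.
        @lockutils.synchronized(os.path.basename(base), external=True,
                                lock_path=lock_path)
        def _create():
            # Probe the shared base image under resource limits, as each
            # request does before creating its overlay.
            out, _ = processutils.execute(
                'env', 'LC_ALL=C', 'LANG=C',
                'qemu-img', 'info', base, '--force-share', '--output=json',
                prlimit=QEMU_IMG_LIMITS)
            info = json.loads(out)

            # Refuse to shrink below the base's virtual size (cf. the
            # "Cannot resize image ... to a smaller size" entries).
            if size_bytes < info['virtual-size']:
                raise ValueError('will not shrink overlay below base size')

            processutils.execute(
                'env', 'LC_ALL=C', 'LANG=C',
                'qemu-img', 'create', '-f', 'qcow2',
                '-o', 'backing_file=%s,backing_fmt=raw' % base,
                disk_path, str(size_bytes))

        _create()

Called with the paths from the entries above (base /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720, disk /opt/stack/data/nova/instances/c4fc7798-567a-4002-b056-6c4f02d0e955/disk, size 1073741824, and an assumed lock_path directory), this would issue the same qemu-img info/create command lines that appear in the log.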
Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/c4fc7798-567a-4002-b056-6c4f02d0e955/disk 1073741824" returned: 0 in 0.049s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.217s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.663s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3681b009-1a99-4eb8-b189-3fe0647f5d1d/disk --force-share --output=json" returned: 0 in 0.167s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Cannot resize image /opt/stack/data/nova/instances/3681b009-1a99-4eb8-b189-3fe0647f5d1d/disk to a smaller size. 
{{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG nova.objects.instance [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lazy-loading 'migration_context' on Instance uuid 3681b009-1a99-4eb8-b189-3fe0647f5d1d {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Ensure instance console log exists: /opt/stack/data/nova/instances/3681b009-1a99-4eb8-b189-3fe0647f5d1d/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.150s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Checking if we can resize image /opt/stack/data/nova/instances/c4fc7798-567a-4002-b056-6c4f02d0e955/disk. 
size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4fc7798-567a-4002-b056-6c4f02d0e955/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.150s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk 1073741824" returned: 0 in 0.043s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.200s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None 
req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.408s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4fc7798-567a-4002-b056-6c4f02d0e955/disk --force-share --output=json" returned: 0 in 0.162s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Cannot resize image /opt/stack/data/nova/instances/c4fc7798-567a-4002-b056-6c4f02d0e955/disk to a smaller size. 
{{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG nova.objects.instance [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Lazy-loading 'migration_context' on Instance uuid c4fc7798-567a-4002-b056-6c4f02d0e955 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Ensure instance console log exists: /opt/stack/data/nova/instances/c4fc7798-567a-4002-b056-6c4f02d0e955/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.131s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Checking if we can resize image /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk. 
size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.142s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk 1073741824" returned: 0 in 0.056s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.216s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG 
oslo_concurrency.processutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json" returned: 0 in 0.159s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Cannot resize image /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk to a smaller size. {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG nova.objects.instance [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lazy-loading 'migration_context' on Instance uuid 7bb0580b-727f-4168-9d56-56dcb4fa404e {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Ensure instance console log exists: /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 
tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.130s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Checking if we can resize image /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk. size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json" returned: 0 in 0.169s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Cannot resize image /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk to a smaller size. 
{{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG nova.objects.instance [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lazy-loading 'migration_context' on Instance uuid 9634492c-168d-4b49-941a-b89703571b73 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Ensure instance console log exists: /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:59 user nova-compute[71972]: DEBUG nova.network.neutron [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Successfully created port: 7557dbee-f2e2-47a0-88eb-1377350f8504 {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:08:59 user nova-compute[71972]: DEBUG nova.network.neutron [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Successfully created port: aede8066-45b3-4414-98a0-50dda5a4ee66 {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:08:59 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None 
req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquiring lock "97067629-e099-49fd-bb79-223dd4401405" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:59 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "97067629-e099-49fd-bb79-223dd4401405" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:59 user nova-compute[71972]: DEBUG nova.compute.manager [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Starting instance... {{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:08:59 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:08:59 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:08:59 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:08:59 user nova-compute[71972]: INFO nova.compute.claims [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Claim successful on node user Apr 17 22:08:59 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:08:59 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:08:59 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.346s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:08:59 user nova-compute[71972]: DEBUG nova.compute.manager [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:08:59 user nova-compute[71972]: DEBUG nova.compute.manager [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Allocating IP information in the background. {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:08:59 user nova-compute[71972]: DEBUG nova.network.neutron [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:08:59 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 17 22:08:59 user nova-compute[71972]: DEBUG nova.compute.manager [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Start building block device mappings for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG nova.compute.manager [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Start spawning the instance on the hypervisor. {{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:09:00 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Creating image(s) Apr 17 22:09:00 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquiring lock "/opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "/opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "/opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG nova.policy [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b3e3003057e7456c933b762412442a3e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a863c30ce3844f0ba754b048c2833fa3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.136s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG nova.network.neutron [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Successfully created port: d22f6b6c-44f0-472e-b05f-192e12d56f32 {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG 
oslo_concurrency.processutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.145s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk 1073741824" returned: 0 in 0.051s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.201s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.139s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 
tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Checking if we can resize image /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk. size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Cannot resize image /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk to a smaller size. {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG nova.objects.instance [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lazy-loading 'migration_context' on Instance uuid 97067629-e099-49fd-bb79-223dd4401405 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Ensure instance console log exists: /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:00 user 
nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG nova.network.neutron [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Successfully updated port: 06548a29-a501-4b57-97f1-8afe930c8463 {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Acquiring lock "refresh_cache-7bb0580b-727f-4168-9d56-56dcb4fa404e" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Acquired lock "refresh_cache-7bb0580b-727f-4168-9d56-56dcb4fa404e" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:00 user nova-compute[71972]: DEBUG nova.network.neutron [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:09:01 user nova-compute[71972]: DEBUG nova.network.neutron [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Instance cache missing network info. {{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:09:01 user nova-compute[71972]: DEBUG nova.compute.manager [req-1beaed43-b800-4156-8367-06323e3435d6 req-aef6a312-be9a-4305-babb-faac2f7376ba service nova] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Received event network-changed-06548a29-a501-4b57-97f1-8afe930c8463 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:01 user nova-compute[71972]: DEBUG nova.compute.manager [req-1beaed43-b800-4156-8367-06323e3435d6 req-aef6a312-be9a-4305-babb-faac2f7376ba service nova] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Refreshing instance network info cache due to event network-changed-06548a29-a501-4b57-97f1-8afe930c8463. 
{{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:09:01 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-1beaed43-b800-4156-8367-06323e3435d6 req-aef6a312-be9a-4305-babb-faac2f7376ba service nova] Acquiring lock "refresh_cache-7bb0580b-727f-4168-9d56-56dcb4fa404e" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:01 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Acquiring lock "cbaa3995-f00d-4194-b7e2-29bfc6e27614" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:01 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "cbaa3995-f00d-4194-b7e2-29bfc6e27614" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:01 user nova-compute[71972]: DEBUG nova.compute.manager [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Starting instance... {{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:09:01 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:01 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:01 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:09:01 user nova-compute[71972]: INFO nova.compute.claims [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Claim successful on node user Apr 17 22:09:01 user nova-compute[71972]: DEBUG nova.network.neutron [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Successfully updated port: 4de27111-4afc-4732-88b3-2485c4f254e8 {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:09:01 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Acquiring lock "refresh_cache-c4fc7798-567a-4002-b056-6c4f02d0e955" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:01 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Acquired lock "refresh_cache-c4fc7798-567a-4002-b056-6c4f02d0e955" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:01 user nova-compute[71972]: DEBUG nova.network.neutron [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:09:01 user nova-compute[71972]: DEBUG nova.network.neutron [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Instance cache missing network info. 
{{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:09:01 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:09:01 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:09:01 user nova-compute[71972]: DEBUG nova.network.neutron [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Successfully created port: 63dc9a41-e89e-4673-a658-7acddd88706f {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:09:01 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.548s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:01 user nova-compute[71972]: DEBUG nova.compute.manager [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.compute.manager [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Allocating IP information in the background. 
{{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.network.neutron [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:09:02 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.compute.manager [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Start building block device mappings for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.network.neutron [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Successfully updated port: 7557dbee-f2e2-47a0-88eb-1377350f8504 {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "refresh_cache-3681b009-1a99-4eb8-b189-3fe0647f5d1d" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquired lock "refresh_cache-3681b009-1a99-4eb8-b189-3fe0647f5d1d" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.network.neutron [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.compute.manager [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Start spawning the instance on the hypervisor. 
{{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:09:02 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Creating image(s) Apr 17 22:09:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Acquiring lock "/opt/stack/data/nova/instances/cbaa3995-f00d-4194-b7e2-29bfc6e27614/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "/opt/stack/data/nova/instances/cbaa3995-f00d-4194-b7e2-29bfc6e27614/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "/opt/stack/data/nova/instances/cbaa3995-f00d-4194-b7e2-29bfc6e27614/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.policy [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a5ec05fe7d7244368c7eec3739a96c19', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9283fe3c9a094f9bbddb08e48973da44', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 
'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.network.neutron [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Updating instance_info_cache with network_info: [{"id": "06548a29-a501-4b57-97f1-8afe930c8463", "address": "fa:16:3e:d4:82:7c", "network": {"id": "3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap06548a29-a5", "ovs_interfaceid": "06548a29-a501-4b57-97f1-8afe930c8463", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Releasing lock "refresh_cache-7bb0580b-727f-4168-9d56-56dcb4fa404e" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.compute.manager [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Instance network_info: |[{"id": "06548a29-a501-4b57-97f1-8afe930c8463", "address": "fa:16:3e:d4:82:7c", "network": {"id": "3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap06548a29-a5", "ovs_interfaceid": "06548a29-a501-4b57-97f1-8afe930c8463", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-1beaed43-b800-4156-8367-06323e3435d6 req-aef6a312-be9a-4305-babb-faac2f7376ba service nova] Acquired lock "refresh_cache-7bb0580b-727f-4168-9d56-56dcb4fa404e" {{(pid=71972) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.network.neutron [req-1beaed43-b800-4156-8367-06323e3435d6 req-aef6a312-be9a-4305-babb-faac2f7376ba service nova] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Refreshing network info cache for port 06548a29-a501-4b57-97f1-8afe930c8463 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Start _get_guest_xml network_info=[{"id": "06548a29-a501-4b57-97f1-8afe930c8463", "address": "fa:16:3e:d4:82:7c", "network": {"id": "3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap06548a29-a5", "ovs_interfaceid": "06548a29-a501-4b57-97f1-8afe930c8463", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:09:02 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:09:02 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) 
{{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.privsep.utils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71972) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:08:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1710989635',display_name='tempest-ServersNegativeTestJSON-server-1710989635',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1710989635',id=1,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f21699c3400842d3a28e71b288a4aaff',ramdisk_id='',reservation_id='r-jifig91e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1844623378',owner_user_name='tempest-ServersNegativeTestJSON-1844623378-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:08:54Z,user_data=None,user_id='51c0b269c97241d9ad122b23af3ca7ea',uuid=7bb0580b-727f-4168-9d56-56dcb4fa404e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06548a29-a501-4b57-97f1-8afe930c8463", "address": "fa:16:3e:d4:82:7c", "network": {"id": "3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap06548a29-a5", "ovs_interfaceid": "06548a29-a501-4b57-97f1-8afe930c8463", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Converting VIF {"id": "06548a29-a501-4b57-97f1-8afe930c8463", "address": "fa:16:3e:d4:82:7c", "network": {"id": 
"3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap06548a29-a5", "ovs_interfaceid": "06548a29-a501-4b57-97f1-8afe930c8463", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:82:7c,bridge_name='br-int',has_traffic_filtering=True,id=06548a29-a501-4b57-97f1-8afe930c8463,network=Network(3bbe159c-29cd-4095-9556-8169500b1716),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06548a29-a5') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.objects.instance [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lazy-loading 'pci_devices' on Instance uuid 7bb0580b-727f-4168-9d56-56dcb4fa404e {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.223s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.compute.manager [req-d1f46f82-fadc-4363-9514-76891fe27c9f req-a2ac3669-e4fb-402c-9d43-e0ae00b132da service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Received event network-changed-7557dbee-f2e2-47a0-88eb-1377350f8504 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.compute.manager [req-d1f46f82-fadc-4363-9514-76891fe27c9f req-a2ac3669-e4fb-402c-9d43-e0ae00b132da service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Refreshing instance network info cache due to event network-changed-7557dbee-f2e2-47a0-88eb-1377350f8504. 
{{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-d1f46f82-fadc-4363-9514-76891fe27c9f req-a2ac3669-e4fb-402c-9d43-e0ae00b132da service nova] Acquiring lock "refresh_cache-3681b009-1a99-4eb8-b189-3fe0647f5d1d" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.network.neutron [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Updating instance_info_cache with network_info: [{"id": "4de27111-4afc-4732-88b3-2485c4f254e8", "address": "fa:16:3e:4e:c3:49", "network": {"id": "c2d58e09-cb04-4c46-8176-ef70f3e76aa2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-75580352-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c0b4d04b1b425db64e6ff5066f1dbe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de27111-4a", "ovs_interfaceid": "4de27111-4afc-4732-88b3-2485c4f254e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] 
Releasing lock "refresh_cache-c4fc7798-567a-4002-b056-6c4f02d0e955" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.compute.manager [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Instance network_info: |[{"id": "4de27111-4afc-4732-88b3-2485c4f254e8", "address": "fa:16:3e:4e:c3:49", "network": {"id": "c2d58e09-cb04-4c46-8176-ef70f3e76aa2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-75580352-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c0b4d04b1b425db64e6ff5066f1dbe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de27111-4a", "ovs_interfaceid": "4de27111-4afc-4732-88b3-2485c4f254e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Start _get_guest_xml network_info=[{"id": "4de27111-4afc-4732-88b3-2485c4f254e8", "address": "fa:16:3e:4e:c3:49", "network": {"id": "c2d58e09-cb04-4c46-8176-ef70f3e76aa2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-75580352-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c0b4d04b1b425db64e6ff5066f1dbe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de27111-4a", "ovs_interfaceid": "4de27111-4afc-4732-88b3-2485c4f254e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 
'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:09:02 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:09:02 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None 
req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:08:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1454701288',display_name='tempest-DeleteServersTestJSON-server-1454701288',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-1454701288',id=4,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='41c0b4d04b1b425db64e6ff5066f1dbe',ramdisk_id='',reservation_id='r-54xld834',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-894860321',owner_user_name='tempest-DeleteServersTestJSON-894860321-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:08:56Z,user_data=None,user_id='9ab44d4339554bfba6ac66bebad74413',uuid=c4fc7798-567a-4002-b056-6c4f02d0e955,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4de27111-4afc-4732-88b3-2485c4f254e8", "address": "fa:16:3e:4e:c3:49", "network": {"id": "c2d58e09-cb04-4c46-8176-ef70f3e76aa2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-75580352-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c0b4d04b1b425db64e6ff5066f1dbe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de27111-4a", "ovs_interfaceid": "4de27111-4afc-4732-88b3-2485c4f254e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Converting VIF {"id": "4de27111-4afc-4732-88b3-2485c4f254e8", "address": "fa:16:3e:4e:c3:49", "network": {"id": 
"c2d58e09-cb04-4c46-8176-ef70f3e76aa2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-75580352-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c0b4d04b1b425db64e6ff5066f1dbe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de27111-4a", "ovs_interfaceid": "4de27111-4afc-4732-88b3-2485c4f254e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:c3:49,bridge_name='br-int',has_traffic_filtering=True,id=4de27111-4afc-4732-88b3-2485c4f254e8,network=Network(c2d58e09-cb04-4c46-8176-ef70f3e76aa2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de27111-4a') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.objects.instance [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Lazy-loading 'pci_devices' on Instance uuid c4fc7798-567a-4002-b056-6c4f02d0e955 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] End _get_guest_xml xml= Apr 17 22:09:02 user nova-compute[71972]: 7bb0580b-727f-4168-9d56-56dcb4fa404e Apr 17 22:09:02 user nova-compute[71972]: instance-00000001 Apr 17 22:09:02 user nova-compute[71972]: 131072 Apr 17 22:09:02 user nova-compute[71972]: 1 Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: tempest-ServersNegativeTestJSON-server-1710989635 Apr 17 22:09:02 user nova-compute[71972]: 2023-04-17 22:09:02 Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: 128 Apr 17 22:09:02 user nova-compute[71972]: 1 Apr 17 22:09:02 user nova-compute[71972]: 0 Apr 17 22:09:02 user nova-compute[71972]: 0 Apr 17 22:09:02 user nova-compute[71972]: 1 Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: tempest-ServersNegativeTestJSON-1844623378-project-member Apr 17 22:09:02 user nova-compute[71972]: tempest-ServersNegativeTestJSON-1844623378 Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user 
nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: OpenStack Foundation Apr 17 22:09:02 user nova-compute[71972]: OpenStack Nova Apr 17 22:09:02 user nova-compute[71972]: 0.0.0 Apr 17 22:09:02 user nova-compute[71972]: 7bb0580b-727f-4168-9d56-56dcb4fa404e Apr 17 22:09:02 user nova-compute[71972]: 7bb0580b-727f-4168-9d56-56dcb4fa404e Apr 17 22:09:02 user nova-compute[71972]: Virtual Machine Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: hvm Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Nehalem Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: /dev/urandom Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: Apr 17 22:09:02 user nova-compute[71972]: {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:08:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1710989635',display_name='tempest-ServersNegativeTestJSON-server-1710989635',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1710989635',id=1,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f21699c3400842d3a28e71b288a4aaff',ramdisk_id='',reservation_id='r-jifig91e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1844623378',owner_user_name='tempest-ServersNegativeTestJSON-1844623378-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:08:54Z,user_data=None,user_id='51c0b269c97241d9ad122b23af3ca7ea',uuid=7bb0580b-727f-4168-9d56-56dcb4fa404e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06548a29-a501-4b57-97f1-8afe930c8463", "address": "fa:16:3e:d4:82:7c", "network": {"id": "3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap06548a29-a5", "ovs_interfaceid": "06548a29-a501-4b57-97f1-8afe930c8463", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Converting VIF {"id": "06548a29-a501-4b57-97f1-8afe930c8463", "address": "fa:16:3e:d4:82:7c", "network": {"id": 
"3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap06548a29-a5", "ovs_interfaceid": "06548a29-a501-4b57-97f1-8afe930c8463", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d4:82:7c,bridge_name='br-int',has_traffic_filtering=True,id=06548a29-a501-4b57-97f1-8afe930c8463,network=Network(3bbe159c-29cd-4095-9556-8169500b1716),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06548a29-a5') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG os_vif [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:82:7c,bridge_name='br-int',has_traffic_filtering=True,id=06548a29-a501-4b57-97f1-8afe930c8463,network=Network(3bbe159c-29cd-4095-9556-8169500b1716),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06548a29-a5') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.network.neutron [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Instance cache missing network info. 
{{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] End _get_guest_xml xml= [libvirt guest domain XML elided: element tags stripped during log capture, leaving only text nodes; the surviving values identify instance c4fc7798-567a-4002-b056-6c4f02d0e955 (instance-00000004), memory 131072 KiB, 1 vCPU, name tempest-DeleteServersTestJSON-server-1454701288, owner tempest-DeleteServersTestJSON-894860321 / tempest-DeleteServersTestJSON-894860321-project-member, sysinfo "OpenStack Foundation" / "OpenStack Nova" 0.0.0, hvm guest, Nehalem CPU model, and a /dev/urandom-backed RNG device] {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}}
Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:08:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1454701288',display_name='tempest-DeleteServersTestJSON-server-1454701288',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-1454701288',id=4,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='41c0b4d04b1b425db64e6ff5066f1dbe',ramdisk_id='',reservation_id='r-54xld834',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-894860321',owner_user_name='tempest-DeleteServersTestJSON-894860321-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:08:56Z,user_data=None,user_id='9ab44d4339554bfba6ac66bebad74413',uuid=c4fc7798-567a-4002-b056-6c4f02d0e955,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4de27111-4afc-4732-88b3-2485c4f254e8", "address": "fa:16:3e:4e:c3:49", "network": {"id": "c2d58e09-cb04-4c46-8176-ef70f3e76aa2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-75580352-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address":
"10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c0b4d04b1b425db64e6ff5066f1dbe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de27111-4a", "ovs_interfaceid": "4de27111-4afc-4732-88b3-2485c4f254e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Converting VIF {"id": "4de27111-4afc-4732-88b3-2485c4f254e8", "address": "fa:16:3e:4e:c3:49", "network": {"id": "c2d58e09-cb04-4c46-8176-ef70f3e76aa2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-75580352-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c0b4d04b1b425db64e6ff5066f1dbe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de27111-4a", "ovs_interfaceid": "4de27111-4afc-4732-88b3-2485c4f254e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:c3:49,bridge_name='br-int',has_traffic_filtering=True,id=4de27111-4afc-4732-88b3-2485c4f254e8,network=Network(c2d58e09-cb04-4c46-8176-ef70f3e76aa2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de27111-4a') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG os_vif [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:c3:49,bridge_name='br-int',has_traffic_filtering=True,id=4de27111-4afc-4732-88b3-2485c4f254e8,network=Network(c2d58e09-cb04-4c46-8176-ef70f3e76aa2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de27111-4a') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Created schema index Interface.name {{(pid=71972) autocreate_indices 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Created schema index Port.name {{(pid=71972) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Created schema index Bridge.name {{(pid=71972) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:106}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] tcp:127.0.0.1:6640: entering CONNECTING {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.269s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [POLLIN] on fd 29 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] tcp:127.0.0.1:6640: entering BACKOFF {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] schema index .name already exists {{(pid=71972) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:102}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] schema index .name already exists {{(pid=71972) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:102}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] schema index .name already exists {{(pid=71972) autocreate_indices /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/__init__.py:102}} Apr 17 22:09:02 user 
nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/cbaa3995-f00d-4194-b7e2-29bfc6e27614/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [POLLOUT] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/cbaa3995-f00d-4194-b7e2-29bfc6e27614/disk 1073741824" returned: 0 in 0.099s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.413s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 
tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:09:02 user nova-compute[71972]: INFO oslo.privsep.daemon [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova-cpu.conf', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpakhag447/privsep.sock'] Apr 17 22:09:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:09:02 user sudo[80866]: stack : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova-cpu.conf --privsep_context vif_plug_ovs.privsep.vif_plug --privsep_sock_path /tmp/tmpakhag447/privsep.sock Apr 17 22:09:02 user sudo[80866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1001) Apr 17 22:09:02 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.125s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Checking if we can resize image 
/opt/stack/data/nova/instances/cbaa3995-f00d-4194-b7e2-29bfc6e27614/disk. size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:09:02 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cbaa3995-f00d-4194-b7e2-29bfc6e27614/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cbaa3995-f00d-4194-b7e2-29bfc6e27614/disk --force-share --output=json" returned: 0 in 0.121s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Cannot resize image /opt/stack/data/nova/instances/cbaa3995-f00d-4194-b7e2-29bfc6e27614/disk to a smaller size. {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG nova.objects.instance [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lazy-loading 'migration_context' on Instance uuid cbaa3995-f00d-4194-b7e2-29bfc6e27614 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Ensure instance console log exists: /opt/stack/data/nova/instances/cbaa3995-f00d-4194-b7e2-29bfc6e27614/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 
tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG nova.network.neutron [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Successfully updated port: aede8066-45b3-4414-98a0-50dda5a4ee66 {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Acquiring lock "refresh_cache-99cb7131-abb8-41d6-bddd-c3bc943b7678" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Acquired lock "refresh_cache-99cb7131-abb8-41d6-bddd-c3bc943b7678" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG nova.network.neutron [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG nova.network.neutron [req-1beaed43-b800-4156-8367-06323e3435d6 req-aef6a312-be9a-4305-babb-faac2f7376ba service nova] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Updated VIF entry in instance network info cache for port 06548a29-a501-4b57-97f1-8afe930c8463. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG nova.network.neutron [req-1beaed43-b800-4156-8367-06323e3435d6 req-aef6a312-be9a-4305-babb-faac2f7376ba service nova] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Updating instance_info_cache with network_info: [{"id": "06548a29-a501-4b57-97f1-8afe930c8463", "address": "fa:16:3e:d4:82:7c", "network": {"id": "3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap06548a29-a5", "ovs_interfaceid": "06548a29-a501-4b57-97f1-8afe930c8463", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-1beaed43-b800-4156-8367-06323e3435d6 req-aef6a312-be9a-4305-babb-faac2f7376ba service nova] Releasing lock "refresh_cache-7bb0580b-727f-4168-9d56-56dcb4fa404e" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG nova.compute.manager [req-3cba0c6c-9bc0-4d8a-8adf-45e8e099515e req-4a66b702-d88e-4a43-8966-407b9bc05c06 service nova] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Received event network-changed-4de27111-4afc-4732-88b3-2485c4f254e8 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG nova.compute.manager [req-3cba0c6c-9bc0-4d8a-8adf-45e8e099515e req-4a66b702-d88e-4a43-8966-407b9bc05c06 service nova] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Refreshing instance network info cache due to event network-changed-4de27111-4afc-4732-88b3-2485c4f254e8. 
{{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-3cba0c6c-9bc0-4d8a-8adf-45e8e099515e req-4a66b702-d88e-4a43-8966-407b9bc05c06 service nova] Acquiring lock "refresh_cache-c4fc7798-567a-4002-b056-6c4f02d0e955" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-3cba0c6c-9bc0-4d8a-8adf-45e8e099515e req-4a66b702-d88e-4a43-8966-407b9bc05c06 service nova] Acquired lock "refresh_cache-c4fc7798-567a-4002-b056-6c4f02d0e955" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG nova.network.neutron [req-3cba0c6c-9bc0-4d8a-8adf-45e8e099515e req-4a66b702-d88e-4a43-8966-407b9bc05c06 service nova] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Refreshing network info cache for port 4de27111-4afc-4732-88b3-2485c4f254e8 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:03 user nova-compute[71972]: DEBUG nova.network.neutron [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Instance cache missing network info. {{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.network.neutron [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Successfully created port: 11c4742a-b778-458f-9a76-1a8d6330f415 {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:09:04 user sudo[80866]: pam_unix(sudo:session): session closed for user root Apr 17 22:09:04 user nova-compute[71972]: INFO oslo.privsep.daemon [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Spawned new privsep daemon via rootwrap Apr 17 22:09:04 user nova-compute[71972]: INFO oslo.privsep.daemon [-] privsep daemon starting Apr 17 22:09:04 user nova-compute[71972]: INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0 Apr 17 22:09:04 user nova-compute[71972]: INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none Apr 17 22:09:04 user nova-compute[71972]: INFO oslo.privsep.daemon [-] privsep daemon running as pid 80874 Apr 17 22:09:04 user nova-compute[71972]: WARNING oslo_privsep.priv_context [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] privsep daemon already running Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.network.neutron [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Updating instance_info_cache with network_info: [{"id": 
"7557dbee-f2e2-47a0-88eb-1377350f8504", "address": "fa:16:3e:55:dc:d0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7557dbee-f2", "ovs_interfaceid": "7557dbee-f2e2-47a0-88eb-1377350f8504", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Releasing lock "refresh_cache-3681b009-1a99-4eb8-b189-3fe0647f5d1d" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Instance network_info: |[{"id": "7557dbee-f2e2-47a0-88eb-1377350f8504", "address": "fa:16:3e:55:dc:d0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7557dbee-f2", "ovs_interfaceid": "7557dbee-f2e2-47a0-88eb-1377350f8504", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-d1f46f82-fadc-4363-9514-76891fe27c9f req-a2ac3669-e4fb-402c-9d43-e0ae00b132da service nova] Acquired lock "refresh_cache-3681b009-1a99-4eb8-b189-3fe0647f5d1d" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.network.neutron [req-d1f46f82-fadc-4363-9514-76891fe27c9f req-a2ac3669-e4fb-402c-9d43-e0ae00b132da service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Refreshing network info cache for port 7557dbee-f2e2-47a0-88eb-1377350f8504 {{(pid=71972) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Start _get_guest_xml network_info=[{"id": "7557dbee-f2e2-47a0-88eb-1377350f8504", "address": "fa:16:3e:55:dc:d0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7557dbee-f2", "ovs_interfaceid": "7557dbee-f2e2-47a0-88eb-1377350f8504", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:09:04 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:09:04 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
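The image-preparation entries earlier in this capture (instance cbaa3995-f00d-4194-b7e2-29bfc6e27614, around 22:09:02) show the two external commands Nova's qcow2 image backend runs while a guest like the one above is being defined: a qemu-img create that layers a copy-on-write overlay on the cached base image, and a qemu-img info wrapped in oslo_concurrency.prlimit so a corrupt or hostile image cannot exhaust memory or CPU during inspection. The following is a minimal sketch that replays those two commands with plain subprocess calls; the paths and the 1 GiB size are copied from the log, while the helper names (create_overlay, inspect_image) are illustrative and not part of Nova.

    import json
    import subprocess

    # Values copied from the log entries above; adjust for another instance.
    BASE = "/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720"
    DISK = "/opt/stack/data/nova/instances/cbaa3995-f00d-4194-b7e2-29bfc6e27614/disk"
    SIZE_BYTES = 1073741824  # 1 GiB root disk requested by the flavor

    def create_overlay(base, disk, size_bytes):
        # Same command as the logged "qemu-img create -f qcow2 -o backing_file=...,backing_fmt=raw".
        subprocess.run(
            ["env", "LC_ALL=C", "LANG=C", "qemu-img", "create", "-f", "qcow2",
             "-o", f"backing_file={base},backing_fmt=raw", disk, str(size_bytes)],
            check=True)

    def inspect_image(path):
        # Same command as the logged prlimit-bounded "qemu-img info"; the prlimit wrapper
        # caps the address space at 1 GiB and CPU time at 30 s, matching the log
        # (the log invokes it via /usr/bin/python3.10).
        result = subprocess.run(
            ["python3", "-m", "oslo_concurrency.prlimit", "--as=1073741824", "--cpu=30",
             "--", "env", "LC_ALL=C", "LANG=C", "qemu-img", "info", path,
             "--force-share", "--output=json"],
            check=True, capture_output=True, text=True)
        return json.loads(result.stdout)

    create_overlay(BASE, DISK, SIZE_BYTES)
    print(inspect_image(DISK)["virtual-size"])

The can_resize_image entries that follow in the log are Nova comparing the requested root-disk size against the overlay's reported virtual size and declining to shrink it, which is why the "Cannot resize image ... to a smaller size." debug message appears rather than a resize command.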
Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71972) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:08:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1530040262',display_name='tempest-AttachVolumeNegativeTest-server-1530040262',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1530040262',id=2,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCjQzi1kjZQeEQ1rYjOKbLcVcDCRQQIijtqR97gxYJ2Onb6dZq9Ac7P5Uos+0FwBhyMNkY6cGIDdtzKuXupShf31TiuVlUPJpGQ92/3ShzPqtsJ2m3VXUdv5ryHzD1eLpg==',key_name='tempest-keypair-1969874175',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e52724ed9bc54905bd5eddd8504e4c77',ramdisk_id='',reservation_id='r-umipdb5q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1678965362',owner_user_name='tempest-AttachVolumeNegativeTest-1678965362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:08:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='52b3e35c03b54ae4b5dabfb1325886a9',uuid=3681b009-1a99-4eb8-b189-3fe0647f5d1d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7557dbee-f2e2-47a0-88eb-1377350f8504", "address": "fa:16:3e:55:dc:d0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7557dbee-f2", "ovs_interfaceid": "7557dbee-f2e2-47a0-88eb-1377350f8504", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Converting VIF {"id": "7557dbee-f2e2-47a0-88eb-1377350f8504", "address": "fa:16:3e:55:dc:d0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7557dbee-f2", "ovs_interfaceid": "7557dbee-f2e2-47a0-88eb-1377350f8504", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:dc:d0,bridge_name='br-int',has_traffic_filtering=True,id=7557dbee-f2e2-47a0-88eb-1377350f8504,network=Network(bd9d53b7-0e22-42c4-8b91-49575cadf44f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7557dbee-f2') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.objects.instance [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lazy-loading 'pci_devices' on Instance uuid 3681b009-1a99-4eb8-b189-3fe0647f5d1d {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] End _get_guest_xml xml= Apr 17 22:09:04 user nova-compute[71972]: 3681b009-1a99-4eb8-b189-3fe0647f5d1d Apr 17 22:09:04 user nova-compute[71972]: instance-00000002 Apr 17 22:09:04 user nova-compute[71972]: 131072 Apr 17 22:09:04 user nova-compute[71972]: 1 Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: tempest-AttachVolumeNegativeTest-server-1530040262 Apr 17 22:09:04 user nova-compute[71972]: 2023-04-17 22:09:04 Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: 128 Apr 17 22:09:04 user nova-compute[71972]: 1 Apr 17 22:09:04 user nova-compute[71972]: 0 Apr 17 22:09:04 user nova-compute[71972]: 0 Apr 17 22:09:04 user nova-compute[71972]: 1 Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: tempest-AttachVolumeNegativeTest-1678965362-project-member Apr 17 22:09:04 user nova-compute[71972]: tempest-AttachVolumeNegativeTest-1678965362 Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: OpenStack Foundation Apr 17 22:09:04 user nova-compute[71972]: OpenStack Nova Apr 17 22:09:04 user nova-compute[71972]: 0.0.0 Apr 17 22:09:04 
user nova-compute[71972]: 3681b009-1a99-4eb8-b189-3fe0647f5d1d Apr 17 22:09:04 user nova-compute[71972]: 3681b009-1a99-4eb8-b189-3fe0647f5d1d Apr 17 22:09:04 user nova-compute[71972]: Virtual Machine Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: hvm Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Nehalem Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: /dev/urandom Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: Apr 17 22:09:04 user nova-compute[71972]: {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:08:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1530040262',display_name='tempest-AttachVolumeNegativeTest-server-1530040262',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1530040262',id=2,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCjQzi1kjZQeEQ1rYjOKbLcVcDCRQQIijtqR97gxYJ2Onb6dZq9Ac7P5Uos+0FwBhyMNkY6cGIDdtzKuXupShf31TiuVlUPJpGQ92/3ShzPqtsJ2m3VXUdv5ryHzD1eLpg==',key_name='tempest-keypair-1969874175',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e52724ed9bc54905bd5eddd8504e4c77',ramdisk_id='',reservation_id='r-umipdb5q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1678965362',owner_user_name='tempest-AttachVolumeNegativeTest-1678965362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:08:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='52b3e35c03b54ae4b5dabfb1325886a9',uuid=3681b009-1a99-4eb8-b189-3fe0647f5d1d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7557dbee-f2e2-47a0-88eb-1377350f8504", "address": "fa:16:3e:55:dc:d0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7557dbee-f2", "ovs_interfaceid": "7557dbee-f2e2-47a0-88eb-1377350f8504", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Converting VIF {"id": "7557dbee-f2e2-47a0-88eb-1377350f8504", "address": "fa:16:3e:55:dc:d0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7557dbee-f2", "ovs_interfaceid": "7557dbee-f2e2-47a0-88eb-1377350f8504", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:dc:d0,bridge_name='br-int',has_traffic_filtering=True,id=7557dbee-f2e2-47a0-88eb-1377350f8504,network=Network(bd9d53b7-0e22-42c4-8b91-49575cadf44f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7557dbee-f2') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG os_vif [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:dc:d0,bridge_name='br-int',has_traffic_filtering=True,id=7557dbee-f2e2-47a0-88eb-1377350f8504,network=Network(bd9d53b7-0e22-42c4-8b91-49575cadf44f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7557dbee-f2') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.network.neutron [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Successfully updated port: 63dc9a41-e89e-4673-a658-7acddd88706f {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquiring lock "refresh_cache-97067629-e099-49fd-bb79-223dd4401405" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 
tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquired lock "refresh_cache-97067629-e099-49fd-bb79-223dd4401405" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.network.neutron [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.compute.manager [req-37ef8341-077e-4cdb-9568-3bf0562da13c req-d750852c-b3d5-49cb-901d-717d3166e631 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Received event network-changed-aede8066-45b3-4414-98a0-50dda5a4ee66 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.compute.manager [req-37ef8341-077e-4cdb-9568-3bf0562da13c req-d750852c-b3d5-49cb-901d-717d3166e631 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Refreshing instance network info cache due to event network-changed-aede8066-45b3-4414-98a0-50dda5a4ee66. {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-37ef8341-077e-4cdb-9568-3bf0562da13c req-d750852c-b3d5-49cb-901d-717d3166e631 service nova] Acquiring lock "refresh_cache-99cb7131-abb8-41d6-bddd-c3bc943b7678" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.network.neutron [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Instance cache missing network info. {{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.compute.manager [req-789ed735-38fb-4a15-b522-c74436e92559 req-928a4f01-2ddc-4f59-820a-f8127bf82aea service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] Received event network-changed-63dc9a41-e89e-4673-a658-7acddd88706f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.compute.manager [req-789ed735-38fb-4a15-b522-c74436e92559 req-928a4f01-2ddc-4f59-820a-f8127bf82aea service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] Refreshing instance network info cache due to event network-changed-63dc9a41-e89e-4673-a658-7acddd88706f. 
{{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-789ed735-38fb-4a15-b522-c74436e92559 req-928a4f01-2ddc-4f59-820a-f8127bf82aea service nova] Acquiring lock "refresh_cache-97067629-e099-49fd-bb79-223dd4401405" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.network.neutron [req-3cba0c6c-9bc0-4d8a-8adf-45e8e099515e req-4a66b702-d88e-4a43-8966-407b9bc05c06 service nova] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Updated VIF entry in instance network info cache for port 4de27111-4afc-4732-88b3-2485c4f254e8. {{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG nova.network.neutron [req-3cba0c6c-9bc0-4d8a-8adf-45e8e099515e req-4a66b702-d88e-4a43-8966-407b9bc05c06 service nova] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Updating instance_info_cache with network_info: [{"id": "4de27111-4afc-4732-88b3-2485c4f254e8", "address": "fa:16:3e:4e:c3:49", "network": {"id": "c2d58e09-cb04-4c46-8176-ef70f3e76aa2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-75580352-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c0b4d04b1b425db64e6ff5066f1dbe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de27111-4a", "ovs_interfaceid": "4de27111-4afc-4732-88b3-2485c4f254e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06548a29-a5, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap06548a29-a5, col_values=(('external_ids', {'iface-id': '06548a29-a501-4b57-97f1-8afe930c8463', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d4:82:7c', 'vm-uuid': '7bb0580b-727f-4168-9d56-56dcb4fa404e'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:09:04 
user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:04 user nova-compute[71972]: INFO os_vif [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d4:82:7c,bridge_name='br-int',has_traffic_filtering=True,id=06548a29-a501-4b57-97f1-8afe930c8463,network=Network(3bbe159c-29cd-4095-9556-8169500b1716),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06548a29-a5') Apr 17 22:09:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7557dbee-f2, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7557dbee-f2, col_values=(('external_ids', {'iface-id': '7557dbee-f2e2-47a0-88eb-1377350f8504', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:55:dc:d0', 'vm-uuid': '3681b009-1a99-4eb8-b189-3fe0647f5d1d'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-3cba0c6c-9bc0-4d8a-8adf-45e8e099515e req-4a66b702-d88e-4a43-8966-407b9bc05c06 service nova] Releasing lock "refresh_cache-c4fc7798-567a-4002-b056-6c4f02d0e955" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:05 user nova-compute[71972]: INFO os_vif [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:dc:d0,bridge_name='br-int',has_traffic_filtering=True,id=7557dbee-f2e2-47a0-88eb-1377350f8504,network=Network(bd9d53b7-0e22-42c4-8b91-49575cadf44f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7557dbee-f2') Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4de27111-4a, may_exist=True) {{(pid=71972) do_commit 
/usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4de27111-4a, col_values=(('external_ids', {'iface-id': '4de27111-4afc-4732-88b3-2485c4f254e8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4e:c3:49', 'vm-uuid': 'c4fc7798-567a-4002-b056-6c4f02d0e955'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:05 user nova-compute[71972]: INFO os_vif [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:c3:49,bridge_name='br-int',has_traffic_filtering=True,id=4de27111-4afc-4732-88b3-2485c4f254e8,network=Network(c2d58e09-cb04-4c46-8176-ef70f3e76aa2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de27111-4a') Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] No BDM found with device name vda, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] No VIF found with MAC fa:16:3e:55:dc:d0, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] No BDM found with device name vda, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] No VIF found with MAC fa:16:3e:d4:82:7c, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] No VIF found with MAC fa:16:3e:4e:c3:49, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.network.neutron [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Updating instance_info_cache with network_info: [{"id": "aede8066-45b3-4414-98a0-50dda5a4ee66", "address": "fa:16:3e:be:3c:02", "network": {"id": "966f14e1-bfad-4b86-921c-0f8f5ad29a5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1196755439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b99089f4e3074ee7a5c1ada03ceb8984", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaede8066-45", "ovs_interfaceid": "aede8066-45b3-4414-98a0-50dda5a4ee66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Releasing lock "refresh_cache-99cb7131-abb8-41d6-bddd-c3bc943b7678" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.compute.manager [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Instance network_info: |[{"id": "aede8066-45b3-4414-98a0-50dda5a4ee66", "address": "fa:16:3e:be:3c:02", "network": {"id": "966f14e1-bfad-4b86-921c-0f8f5ad29a5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1196755439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b99089f4e3074ee7a5c1ada03ceb8984", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaede8066-45", "ovs_interfaceid": "aede8066-45b3-4414-98a0-50dda5a4ee66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-37ef8341-077e-4cdb-9568-3bf0562da13c req-d750852c-b3d5-49cb-901d-717d3166e631 service nova] Acquired lock "refresh_cache-99cb7131-abb8-41d6-bddd-c3bc943b7678" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.network.neutron [req-37ef8341-077e-4cdb-9568-3bf0562da13c req-d750852c-b3d5-49cb-901d-717d3166e631 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Refreshing network info cache for port aede8066-45b3-4414-98a0-50dda5a4ee66 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Start _get_guest_xml network_info=[{"id": "aede8066-45b3-4414-98a0-50dda5a4ee66", "address": "fa:16:3e:be:3c:02", "network": {"id": "966f14e1-bfad-4b86-921c-0f8f5ad29a5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1196755439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b99089f4e3074ee7a5c1ada03ceb8984", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaede8066-45", "ovs_interfaceid": "aede8066-45b3-4414-98a0-50dda5a4ee66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:09:05 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
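The AddPortCommand/DbSetCommand transactions above are how os-vif wires the instance's tap device into br-int: the port is added to the integration bridge and its Interface row is tagged with the Neutron port UUID (iface-id), the guest MAC (attached-mac) and the instance UUID (vm-uuid), which is how the OVN controller identifies and binds the port. A minimal read-back sketch, outside of anything nova itself runs, assuming ovs-vsctl is available on the compute host and taking the tap name from the log entry above:

import subprocess

def get_external_ids(ifname: str) -> str:
    # "ovs-vsctl get Interface <name> external_ids" prints the column as a map,
    # e.g. {attached-mac="fa:16:3e:4e:c3:49", iface-id="4de27111-...", vm-uuid="..."}
    return subprocess.check_output(
        ["ovs-vsctl", "get", "Interface", ifname, "external_ids"],
        text=True,
    ).strip()

print(get_external_ids("tap4de27111-4a"))

The values printed should match the col_values tuple in the DbSetCommand entry; a mismatch would mean the plug failed or the port was re-tagged by a later transaction.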
Apr 17 22:09:05 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None 
req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:08:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-2094342662',display_name='tempest-ServerActionsTestJSON-server-2094342662',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-2094342662',id=3,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMIJM6dCvYLXS8VGI2L2wG2Wl5w2/gW9spGN2iclkhvyAOdgyxFfXQNjna17ZfeIKWKVS3RLrdPtDvd/wHntS9qVvu9iFpd3o+fH4gutMbeRa70JnJhZgkKJB4XiFchpJA==',key_name='tempest-keypair-1204015820',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b99089f4e3074ee7a5c1ada03ceb8984',ramdisk_id='',reservation_id='r-1222ez6u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1322112249',owner_user_name='tempest-ServerActionsTestJSON-1322112249-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:08:56Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9be63d1d20854fa28375599715a5ba74',uuid=99cb7131-abb8-41d6-bddd-c3bc943b7678,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "aede8066-45b3-4414-98a0-50dda5a4ee66", "address": "fa:16:3e:be:3c:02", "network": {"id": "966f14e1-bfad-4b86-921c-0f8f5ad29a5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1196755439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b99089f4e3074ee7a5c1ada03ceb8984", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaede8066-45", "ovs_interfaceid": "aede8066-45b3-4414-98a0-50dda5a4ee66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Converting VIF {"id": "aede8066-45b3-4414-98a0-50dda5a4ee66", "address": "fa:16:3e:be:3c:02", "network": {"id": "966f14e1-bfad-4b86-921c-0f8f5ad29a5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1196755439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "b99089f4e3074ee7a5c1ada03ceb8984", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaede8066-45", "ovs_interfaceid": "aede8066-45b3-4414-98a0-50dda5a4ee66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:3c:02,bridge_name='br-int',has_traffic_filtering=True,id=aede8066-45b3-4414-98a0-50dda5a4ee66,network=Network(966f14e1-bfad-4b86-921c-0f8f5ad29a5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaede8066-45') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.objects.instance [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Lazy-loading 'pci_devices' on Instance uuid 99cb7131-abb8-41d6-bddd-c3bc943b7678 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] End _get_guest_xml xml= Apr 17 22:09:05 user nova-compute[71972]: 99cb7131-abb8-41d6-bddd-c3bc943b7678 Apr 17 22:09:05 user nova-compute[71972]: instance-00000003 Apr 17 22:09:05 user nova-compute[71972]: 131072 Apr 17 22:09:05 user nova-compute[71972]: 1 Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: tempest-ServerActionsTestJSON-server-2094342662 Apr 17 22:09:05 user nova-compute[71972]: 2023-04-17 22:09:05 Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: 128 Apr 17 22:09:05 user nova-compute[71972]: 1 Apr 17 22:09:05 user nova-compute[71972]: 0 Apr 17 22:09:05 user nova-compute[71972]: 0 Apr 17 22:09:05 user nova-compute[71972]: 1 Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: tempest-ServerActionsTestJSON-1322112249-project-member Apr 17 22:09:05 user nova-compute[71972]: tempest-ServerActionsTestJSON-1322112249 Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: OpenStack Foundation Apr 17 22:09:05 user nova-compute[71972]: OpenStack Nova Apr 17 22:09:05 user nova-compute[71972]: 0.0.0 Apr 17 22:09:05 user nova-compute[71972]: 
99cb7131-abb8-41d6-bddd-c3bc943b7678 Apr 17 22:09:05 user nova-compute[71972]: 99cb7131-abb8-41d6-bddd-c3bc943b7678 Apr 17 22:09:05 user nova-compute[71972]: Virtual Machine Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: hvm Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Nehalem Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: /dev/urandom Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:08:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-2094342662',display_name='tempest-ServerActionsTestJSON-server-2094342662',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-2094342662',id=3,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMIJM6dCvYLXS8VGI2L2wG2Wl5w2/gW9spGN2iclkhvyAOdgyxFfXQNjna17ZfeIKWKVS3RLrdPtDvd/wHntS9qVvu9iFpd3o+fH4gutMbeRa70JnJhZgkKJB4XiFchpJA==',key_name='tempest-keypair-1204015820',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b99089f4e3074ee7a5c1ada03ceb8984',ramdisk_id='',reservation_id='r-1222ez6u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1322112249',owner_user_name='tempest-ServerActionsTestJSON-1322112249-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:08:56Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9be63d1d20854fa28375599715a5ba74',uuid=99cb7131-abb8-41d6-bddd-c3bc943b7678,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "aede8066-45b3-4414-98a0-50dda5a4ee66", "address": "fa:16:3e:be:3c:02", "network": {"id": "966f14e1-bfad-4b86-921c-0f8f5ad29a5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1196755439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b99089f4e3074ee7a5c1ada03ceb8984", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaede8066-45", "ovs_interfaceid": "aede8066-45b3-4414-98a0-50dda5a4ee66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Converting VIF {"id": "aede8066-45b3-4414-98a0-50dda5a4ee66", "address": "fa:16:3e:be:3c:02", "network": {"id": "966f14e1-bfad-4b86-921c-0f8f5ad29a5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1196755439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, 
"tenant_id": "b99089f4e3074ee7a5c1ada03ceb8984", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaede8066-45", "ovs_interfaceid": "aede8066-45b3-4414-98a0-50dda5a4ee66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:3c:02,bridge_name='br-int',has_traffic_filtering=True,id=aede8066-45b3-4414-98a0-50dda5a4ee66,network=Network(966f14e1-bfad-4b86-921c-0f8f5ad29a5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaede8066-45') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG os_vif [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:3c:02,bridge_name='br-int',has_traffic_filtering=True,id=aede8066-45b3-4414-98a0-50dda5a4ee66,network=Network(966f14e1-bfad-4b86-921c-0f8f5ad29a5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaede8066-45') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaede8066-45, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaede8066-45, col_values=(('external_ids', {'iface-id': 'aede8066-45b3-4414-98a0-50dda5a4ee66', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:be:3c:02', 'vm-uuid': '99cb7131-abb8-41d6-bddd-c3bc943b7678'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:05 user nova-compute[71972]: INFO os_vif [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:3c:02,bridge_name='br-int',has_traffic_filtering=True,id=aede8066-45b3-4414-98a0-50dda5a4ee66,network=Network(966f14e1-bfad-4b86-921c-0f8f5ad29a5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaede8066-45') Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] No BDM found with device name vda, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] No VIF found with MAC fa:16:3e:be:3c:02, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.network.neutron [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Successfully updated port: d22f6b6c-44f0-472e-b05f-192e12d56f32 {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquiring lock "refresh_cache-9634492c-168d-4b49-941a-b89703571b73" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquired lock "refresh_cache-9634492c-168d-4b49-941a-b89703571b73" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.network.neutron [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.network.neutron [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 
tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Updating instance_info_cache with network_info: [{"id": "63dc9a41-e89e-4673-a658-7acddd88706f", "address": "fa:16:3e:3e:d4:f0", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap63dc9a41-e8", "ovs_interfaceid": "63dc9a41-e89e-4673-a658-7acddd88706f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Releasing lock "refresh_cache-97067629-e099-49fd-bb79-223dd4401405" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.compute.manager [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Instance network_info: |[{"id": "63dc9a41-e89e-4673-a658-7acddd88706f", "address": "fa:16:3e:3e:d4:f0", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap63dc9a41-e8", "ovs_interfaceid": "63dc9a41-e89e-4673-a658-7acddd88706f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-789ed735-38fb-4a15-b522-c74436e92559 req-928a4f01-2ddc-4f59-820a-f8127bf82aea service nova] Acquired lock "refresh_cache-97067629-e099-49fd-bb79-223dd4401405" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.network.neutron [req-789ed735-38fb-4a15-b522-c74436e92559 req-928a4f01-2ddc-4f59-820a-f8127bf82aea service nova] 
[instance: 97067629-e099-49fd-bb79-223dd4401405] Refreshing network info cache for port 63dc9a41-e89e-4673-a658-7acddd88706f {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Start _get_guest_xml network_info=[{"id": "63dc9a41-e89e-4673-a658-7acddd88706f", "address": "fa:16:3e:3e:d4:f0", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap63dc9a41-e8", "ovs_interfaceid": "63dc9a41-e89e-4673-a658-7acddd88706f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:09:05 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:09:05 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
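The nova.virt.hardware entries that follow (Flavor/Image limits 0:0:0, limits of 65536 each, "Build topologies for 1 vcpu(s) 1:1:1") trace CPU topology selection for the 1-vCPU m1.nano flavor. As an illustration only, not nova's implementation, the outcome can be reproduced by enumerating the socket/core/thread triples whose product equals the vCPU count under those limits; for one vCPU the only candidate is 1:1:1, matching the VirtCPUTopology(cores=1,sockets=1,threads=1) the driver reports as the sole possible and desired topology:

from itertools import product

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    # Keep every (sockets, cores, threads) triple within the limits whose
    # product equals the requested vCPU count.
    found = []
    for sockets, cores, threads in product(
            range(1, min(vcpus, max_sockets) + 1),
            range(1, min(vcpus, max_cores) + 1),
            range(1, min(vcpus, max_threads) + 1)):
        if sockets * cores * threads == vcpus:
            found.append((sockets, cores, threads))
    return found

print(possible_topologies(1))   # -> [(1, 1, 1)]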
Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:08:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1698566818',display_name='tempest-ServerRescueNegativeTestJSON-server-1698566818',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1698566818',id=6,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a863c30ce3844f0ba754b048c2833fa3',ramdisk_id='',reservation_id='r-x70r9ud0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-2008986942',owner_user_name='tempest-ServerRescueNegativeTestJSON-2008986942-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:09:00Z,user_data=None,user_id='b3e3003057e7456c933b762412442a3e',uuid=97067629-e099-49fd-bb79-223dd4401405,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "63dc9a41-e89e-4673-a658-7acddd88706f", "address": "fa:16:3e:3e:d4:f0", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap63dc9a41-e8", "ovs_interfaceid": "63dc9a41-e89e-4673-a658-7acddd88706f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Converting VIF {"id": "63dc9a41-e89e-4673-a658-7acddd88706f", "address": 
"fa:16:3e:3e:d4:f0", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap63dc9a41-e8", "ovs_interfaceid": "63dc9a41-e89e-4673-a658-7acddd88706f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:d4:f0,bridge_name='br-int',has_traffic_filtering=True,id=63dc9a41-e89e-4673-a658-7acddd88706f,network=Network(bd20802f-022c-4a11-8310-49b2375f642e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63dc9a41-e8') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.objects.instance [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lazy-loading 'pci_devices' on Instance uuid 97067629-e099-49fd-bb79-223dd4401405 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] End _get_guest_xml xml= Apr 17 22:09:05 user nova-compute[71972]: 97067629-e099-49fd-bb79-223dd4401405 Apr 17 22:09:05 user nova-compute[71972]: instance-00000006 Apr 17 22:09:05 user nova-compute[71972]: 131072 Apr 17 22:09:05 user nova-compute[71972]: 1 Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: tempest-ServerRescueNegativeTestJSON-server-1698566818 Apr 17 22:09:05 user nova-compute[71972]: 2023-04-17 22:09:05 Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: 128 Apr 17 22:09:05 user nova-compute[71972]: 1 Apr 17 22:09:05 user nova-compute[71972]: 0 Apr 17 22:09:05 user nova-compute[71972]: 0 Apr 17 22:09:05 user nova-compute[71972]: 1 Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: tempest-ServerRescueNegativeTestJSON-2008986942-project-member Apr 17 22:09:05 user nova-compute[71972]: tempest-ServerRescueNegativeTestJSON-2008986942 Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 
user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: OpenStack Foundation Apr 17 22:09:05 user nova-compute[71972]: OpenStack Nova Apr 17 22:09:05 user nova-compute[71972]: 0.0.0 Apr 17 22:09:05 user nova-compute[71972]: 97067629-e099-49fd-bb79-223dd4401405 Apr 17 22:09:05 user nova-compute[71972]: 97067629-e099-49fd-bb79-223dd4401405 Apr 17 22:09:05 user nova-compute[71972]: Virtual Machine Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: hvm Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Nehalem Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: /dev/urandom Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: Apr 17 22:09:05 user nova-compute[71972]: {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:08:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1698566818',display_name='tempest-ServerRescueNegativeTestJSON-server-1698566818',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1698566818',id=6,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a863c30ce3844f0ba754b048c2833fa3',ramdisk_id='',reservation_id='r-x70r9ud0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-2008986942',owner_user_name='tempest-ServerRescueNegativeTestJSON-2008986942-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:09:00Z,user_data=None,user_id='b3e3003057e7456c933b762412442a3e',uuid=97067629-e099-49fd-bb79-223dd4401405,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "63dc9a41-e89e-4673-a658-7acddd88706f", "address": "fa:16:3e:3e:d4:f0", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap63dc9a41-e8", "ovs_interfaceid": "63dc9a41-e89e-4673-a658-7acddd88706f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Converting VIF {"id": "63dc9a41-e89e-4673-a658-7acddd88706f", "address": 
"fa:16:3e:3e:d4:f0", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap63dc9a41-e8", "ovs_interfaceid": "63dc9a41-e89e-4673-a658-7acddd88706f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:d4:f0,bridge_name='br-int',has_traffic_filtering=True,id=63dc9a41-e89e-4673-a658-7acddd88706f,network=Network(bd20802f-022c-4a11-8310-49b2375f642e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63dc9a41-e8') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG os_vif [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:d4:f0,bridge_name='br-int',has_traffic_filtering=True,id=63dc9a41-e89e-4673-a658-7acddd88706f,network=Network(bd20802f-022c-4a11-8310-49b2375f642e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63dc9a41-e8') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap63dc9a41-e8, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap63dc9a41-e8, col_values=(('external_ids', {'iface-id': '63dc9a41-e89e-4673-a658-7acddd88706f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3e:d4:f0', 'vm-uuid': '97067629-e099-49fd-bb79-223dd4401405'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:05 user nova-compute[71972]: INFO os_vif [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:d4:f0,bridge_name='br-int',has_traffic_filtering=True,id=63dc9a41-e89e-4673-a658-7acddd88706f,network=Network(bd20802f-022c-4a11-8310-49b2375f642e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63dc9a41-e8') Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.network.neutron [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Instance cache missing network info. {{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] No BDM found with device name vda, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:09:05 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] No VIF found with MAC fa:16:3e:3e:d4:f0, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:09:06 user nova-compute[71972]: DEBUG nova.network.neutron [req-d1f46f82-fadc-4363-9514-76891fe27c9f req-a2ac3669-e4fb-402c-9d43-e0ae00b132da service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Updated VIF entry in instance network info cache for port 7557dbee-f2e2-47a0-88eb-1377350f8504. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:09:06 user nova-compute[71972]: DEBUG nova.network.neutron [req-d1f46f82-fadc-4363-9514-76891fe27c9f req-a2ac3669-e4fb-402c-9d43-e0ae00b132da service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Updating instance_info_cache with network_info: [{"id": "7557dbee-f2e2-47a0-88eb-1377350f8504", "address": "fa:16:3e:55:dc:d0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7557dbee-f2", "ovs_interfaceid": "7557dbee-f2e2-47a0-88eb-1377350f8504", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:06 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-d1f46f82-fadc-4363-9514-76891fe27c9f req-a2ac3669-e4fb-402c-9d43-e0ae00b132da service nova] Releasing lock "refresh_cache-3681b009-1a99-4eb8-b189-3fe0647f5d1d" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:06 user nova-compute[71972]: DEBUG nova.network.neutron [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Updating instance_info_cache with network_info: [{"id": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "address": "fa:16:3e:ab:f9:49", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd22f6b6c-44", "ovs_interfaceid": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:06 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Releasing lock "refresh_cache-9634492c-168d-4b49-941a-b89703571b73" {{(pid=71972) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:06 user nova-compute[71972]: DEBUG nova.compute.manager [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Instance network_info: |[{"id": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "address": "fa:16:3e:ab:f9:49", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd22f6b6c-44", "ovs_interfaceid": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:09:06 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Start _get_guest_xml network_info=[{"id": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "address": "fa:16:3e:ab:f9:49", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd22f6b6c-44", "ovs_interfaceid": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': 
'/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:09:06 user nova-compute[71972]: DEBUG nova.compute.manager [req-f47c8f9a-2a82-4fe0-9fb6-11af08d76a47 req-c9fafbc0-90db-4f79-8bca-52abf222f7c4 service nova] [instance: 9634492c-168d-4b49-941a-b89703571b73] Received event network-changed-d22f6b6c-44f0-472e-b05f-192e12d56f32 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:06 user nova-compute[71972]: DEBUG nova.compute.manager [req-f47c8f9a-2a82-4fe0-9fb6-11af08d76a47 req-c9fafbc0-90db-4f79-8bca-52abf222f7c4 service nova] [instance: 9634492c-168d-4b49-941a-b89703571b73] Refreshing instance network info cache due to event network-changed-d22f6b6c-44f0-472e-b05f-192e12d56f32. {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:09:06 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-f47c8f9a-2a82-4fe0-9fb6-11af08d76a47 req-c9fafbc0-90db-4f79-8bca-52abf222f7c4 service nova] Acquiring lock "refresh_cache-9634492c-168d-4b49-941a-b89703571b73" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:06 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-f47c8f9a-2a82-4fe0-9fb6-11af08d76a47 req-c9fafbc0-90db-4f79-8bca-52abf222f7c4 service nova] Acquired lock "refresh_cache-9634492c-168d-4b49-941a-b89703571b73" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:06 user nova-compute[71972]: DEBUG nova.network.neutron [req-f47c8f9a-2a82-4fe0-9fb6-11af08d76a47 req-c9fafbc0-90db-4f79-8bca-52abf222f7c4 service nova] [instance: 9634492c-168d-4b49-941a-b89703571b73] Refreshing network info cache for port d22f6b6c-44f0-472e-b05f-192e12d56f32 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:09:06 user nova-compute[71972]: DEBUG nova.network.neutron [req-37ef8341-077e-4cdb-9568-3bf0562da13c req-d750852c-b3d5-49cb-901d-717d3166e631 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Updated VIF entry in instance network info cache for port aede8066-45b3-4414-98a0-50dda5a4ee66. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:09:06 user nova-compute[71972]: DEBUG nova.network.neutron [req-37ef8341-077e-4cdb-9568-3bf0562da13c req-d750852c-b3d5-49cb-901d-717d3166e631 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Updating instance_info_cache with network_info: [{"id": "aede8066-45b3-4414-98a0-50dda5a4ee66", "address": "fa:16:3e:be:3c:02", "network": {"id": "966f14e1-bfad-4b86-921c-0f8f5ad29a5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1196755439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b99089f4e3074ee7a5c1ada03ceb8984", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaede8066-45", "ovs_interfaceid": "aede8066-45b3-4414-98a0-50dda5a4ee66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:06 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-37ef8341-077e-4cdb-9568-3bf0562da13c req-d750852c-b3d5-49cb-901d-717d3166e631 service nova] Releasing lock "refresh_cache-99cb7131-abb8-41d6-bddd-c3bc943b7678" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:07 user nova-compute[71972]: DEBUG nova.network.neutron [req-789ed735-38fb-4a15-b522-c74436e92559 req-928a4f01-2ddc-4f59-820a-f8127bf82aea service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] Updated VIF entry in instance network info cache for port 63dc9a41-e89e-4673-a658-7acddd88706f. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:09:07 user nova-compute[71972]: DEBUG nova.network.neutron [req-789ed735-38fb-4a15-b522-c74436e92559 req-928a4f01-2ddc-4f59-820a-f8127bf82aea service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] Updating instance_info_cache with network_info: [{"id": "63dc9a41-e89e-4673-a658-7acddd88706f", "address": "fa:16:3e:3e:d4:f0", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap63dc9a41-e8", "ovs_interfaceid": "63dc9a41-e89e-4673-a658-7acddd88706f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:07 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-789ed735-38fb-4a15-b522-c74436e92559 req-928a4f01-2ddc-4f59-820a-f8127bf82aea service nova] Releasing lock "refresh_cache-97067629-e099-49fd-bb79-223dd4401405" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:07 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:07 user nova-compute[71972]: DEBUG nova.network.neutron [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Successfully updated port: 11c4742a-b778-458f-9a76-1a8d6330f415 {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:09:07 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Acquiring lock "refresh_cache-cbaa3995-f00d-4194-b7e2-29bfc6e27614" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:07 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Acquired lock "refresh_cache-cbaa3995-f00d-4194-b7e2-29bfc6e27614" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:07 user nova-compute[71972]: DEBUG nova.network.neutron [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Building network info cache for instance {{(pid=71972) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:09:07 user nova-compute[71972]: DEBUG nova.compute.manager [req-ae39e72e-1f19-42c1-b967-e1c0accd790a req-d5456996-c543-4840-82ad-c73b0cc1c7e0 service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Received event network-changed-11c4742a-b778-458f-9a76-1a8d6330f415 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:07 user nova-compute[71972]: DEBUG nova.compute.manager [req-ae39e72e-1f19-42c1-b967-e1c0accd790a req-d5456996-c543-4840-82ad-c73b0cc1c7e0 service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Refreshing instance network info cache due to event network-changed-11c4742a-b778-458f-9a76-1a8d6330f415. {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:09:07 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-ae39e72e-1f19-42c1-b967-e1c0accd790a req-d5456996-c543-4840-82ad-c73b0cc1c7e0 service nova] Acquiring lock "refresh_cache-cbaa3995-f00d-4194-b7e2-29bfc6e27614" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:08 user nova-compute[71972]: DEBUG nova.network.neutron [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Instance cache missing network info. {{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:09:08 user nova-compute[71972]: DEBUG nova.network.neutron [req-f47c8f9a-2a82-4fe0-9fb6-11af08d76a47 req-c9fafbc0-90db-4f79-8bca-52abf222f7c4 service nova] [instance: 9634492c-168d-4b49-941a-b89703571b73] Updated VIF entry in instance network info cache for port d22f6b6c-44f0-472e-b05f-192e12d56f32. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:09:08 user nova-compute[71972]: DEBUG nova.network.neutron [req-f47c8f9a-2a82-4fe0-9fb6-11af08d76a47 req-c9fafbc0-90db-4f79-8bca-52abf222f7c4 service nova] [instance: 9634492c-168d-4b49-941a-b89703571b73] Updating instance_info_cache with network_info: [{"id": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "address": "fa:16:3e:ab:f9:49", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd22f6b6c-44", "ovs_interfaceid": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:08 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-f47c8f9a-2a82-4fe0-9fb6-11af08d76a47 req-c9fafbc0-90db-4f79-8bca-52abf222f7c4 service nova] Releasing lock "refresh_cache-9634492c-168d-4b49-941a-b89703571b73" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:08 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:09 user nova-compute[71972]: DEBUG nova.network.neutron [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Updating instance_info_cache with network_info: [{"id": "11c4742a-b778-458f-9a76-1a8d6330f415", "address": "fa:16:3e:43:2f:41", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c4742a-b7", "ovs_interfaceid": "11c4742a-b778-458f-9a76-1a8d6330f415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:09 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 
tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Releasing lock "refresh_cache-cbaa3995-f00d-4194-b7e2-29bfc6e27614" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:09 user nova-compute[71972]: DEBUG nova.compute.manager [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Instance network_info: |[{"id": "11c4742a-b778-458f-9a76-1a8d6330f415", "address": "fa:16:3e:43:2f:41", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c4742a-b7", "ovs_interfaceid": "11c4742a-b778-458f-9a76-1a8d6330f415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:09:09 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-ae39e72e-1f19-42c1-b967-e1c0accd790a req-d5456996-c543-4840-82ad-c73b0cc1c7e0 service nova] Acquired lock "refresh_cache-cbaa3995-f00d-4194-b7e2-29bfc6e27614" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:09 user nova-compute[71972]: DEBUG nova.network.neutron [req-ae39e72e-1f19-42c1-b967-e1c0accd790a req-d5456996-c543-4840-82ad-c73b0cc1c7e0 service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Refreshing network info cache for port 11c4742a-b778-458f-9a76-1a8d6330f415 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:09:09 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Start _get_guest_xml network_info=[{"id": "11c4742a-b778-458f-9a76-1a8d6330f415", "address": "fa:16:3e:43:2f:41", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c4742a-b7", "ovs_interfaceid": "11c4742a-b778-458f-9a76-1a8d6330f415", "qbh_params": null, "qbg_params": 
null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:09:09 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:09 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:09 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:10 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:10 user nova-compute[71972]: DEBUG nova.network.neutron [req-ae39e72e-1f19-42c1-b967-e1c0accd790a req-d5456996-c543-4840-82ad-c73b0cc1c7e0 service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Updated VIF entry in instance network info cache for port 11c4742a-b778-458f-9a76-1a8d6330f415. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:09:10 user nova-compute[71972]: DEBUG nova.network.neutron [req-ae39e72e-1f19-42c1-b967-e1c0accd790a req-d5456996-c543-4840-82ad-c73b0cc1c7e0 service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Updating instance_info_cache with network_info: [{"id": "11c4742a-b778-458f-9a76-1a8d6330f415", "address": "fa:16:3e:43:2f:41", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c4742a-b7", "ovs_interfaceid": "11c4742a-b778-458f-9a76-1a8d6330f415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:10 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:10 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-ae39e72e-1f19-42c1-b967-e1c0accd790a req-d5456996-c543-4840-82ad-c73b0cc1c7e0 service nova] Releasing lock "refresh_cache-cbaa3995-f00d-4194-b7e2-29bfc6e27614" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:10 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:10 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:10 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:10 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:10 user nova-compute[71972]: DEBUG nova.compute.manager [req-2435e59d-51f8-4f77-9f99-5368ae7a60c3 req-8fa3c2ea-b606-4c14-abe0-4c937082a28e service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Received event network-vif-plugged-7557dbee-f2e2-47a0-88eb-1377350f8504 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:10 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-2435e59d-51f8-4f77-9f99-5368ae7a60c3 req-8fa3c2ea-b606-4c14-abe0-4c937082a28e service nova] Acquiring lock "3681b009-1a99-4eb8-b189-3fe0647f5d1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:10 user nova-compute[71972]: DEBUG 
oslo_concurrency.lockutils [req-2435e59d-51f8-4f77-9f99-5368ae7a60c3 req-8fa3c2ea-b606-4c14-abe0-4c937082a28e service nova] Lock "3681b009-1a99-4eb8-b189-3fe0647f5d1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:10 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-2435e59d-51f8-4f77-9f99-5368ae7a60c3 req-8fa3c2ea-b606-4c14-abe0-4c937082a28e service nova] Lock "3681b009-1a99-4eb8-b189-3fe0647f5d1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:10 user nova-compute[71972]: DEBUG nova.compute.manager [req-2435e59d-51f8-4f77-9f99-5368ae7a60c3 req-8fa3c2ea-b606-4c14-abe0-4c937082a28e service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] No waiting events found dispatching network-vif-plugged-7557dbee-f2e2-47a0-88eb-1377350f8504 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:09:10 user nova-compute[71972]: WARNING nova.compute.manager [req-2435e59d-51f8-4f77-9f99-5368ae7a60c3 req-8fa3c2ea-b606-4c14-abe0-4c937082a28e service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Received unexpected event network-vif-plugged-7557dbee-f2e2-47a0-88eb-1377350f8504 for instance with vm_state building and task_state spawning. Apr 17 22:09:10 user nova-compute[71972]: DEBUG nova.compute.manager [req-d14a53bd-f329-41ee-98b7-5147ce824cf5 req-da7a4779-7d82-4cc5-a515-249d8a19f50f service nova] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Received event network-vif-plugged-4de27111-4afc-4732-88b3-2485c4f254e8 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:10 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-d14a53bd-f329-41ee-98b7-5147ce824cf5 req-da7a4779-7d82-4cc5-a515-249d8a19f50f service nova] Acquiring lock "c4fc7798-567a-4002-b056-6c4f02d0e955-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:10 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-d14a53bd-f329-41ee-98b7-5147ce824cf5 req-da7a4779-7d82-4cc5-a515-249d8a19f50f service nova] Lock "c4fc7798-567a-4002-b056-6c4f02d0e955-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:10 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-d14a53bd-f329-41ee-98b7-5147ce824cf5 req-da7a4779-7d82-4cc5-a515-249d8a19f50f service nova] Lock "c4fc7798-567a-4002-b056-6c4f02d0e955-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:10 user nova-compute[71972]: DEBUG nova.compute.manager [req-d14a53bd-f329-41ee-98b7-5147ce824cf5 req-da7a4779-7d82-4cc5-a515-249d8a19f50f service nova] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] No waiting events found dispatching network-vif-plugged-4de27111-4afc-4732-88b3-2485c4f254e8 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:09:10 user 
nova-compute[71972]: WARNING nova.compute.manager [req-d14a53bd-f329-41ee-98b7-5147ce824cf5 req-da7a4779-7d82-4cc5-a515-249d8a19f50f service nova] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Received unexpected event network-vif-plugged-4de27111-4afc-4732-88b3-2485c4f254e8 for instance with vm_state building and task_state spawning. Apr 17 22:09:10 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:10 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:10 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:10 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:11 user nova-compute[71972]: DEBUG nova.compute.manager [req-4f406267-cd82-4951-8b21-4db78a569da8 req-c563ed63-447b-4280-b5aa-1d378d07afd1 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Received event network-vif-plugged-aede8066-45b3-4414-98a0-50dda5a4ee66 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:11 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-4f406267-cd82-4951-8b21-4db78a569da8 req-c563ed63-447b-4280-b5aa-1d378d07afd1 service nova] Acquiring lock "99cb7131-abb8-41d6-bddd-c3bc943b7678-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:11 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-4f406267-cd82-4951-8b21-4db78a569da8 req-c563ed63-447b-4280-b5aa-1d378d07afd1 service nova] Lock "99cb7131-abb8-41d6-bddd-c3bc943b7678-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:11 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-4f406267-cd82-4951-8b21-4db78a569da8 req-c563ed63-447b-4280-b5aa-1d378d07afd1 service nova] Lock "99cb7131-abb8-41d6-bddd-c3bc943b7678-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:11 user nova-compute[71972]: DEBUG nova.compute.manager [req-4f406267-cd82-4951-8b21-4db78a569da8 
req-c563ed63-447b-4280-b5aa-1d378d07afd1 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] No waiting events found dispatching network-vif-plugged-aede8066-45b3-4414-98a0-50dda5a4ee66 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:09:11 user nova-compute[71972]: WARNING nova.compute.manager [req-4f406267-cd82-4951-8b21-4db78a569da8 req-c563ed63-447b-4280-b5aa-1d378d07afd1 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Received unexpected event network-vif-plugged-aede8066-45b3-4414-98a0-50dda5a4ee66 for instance with vm_state building and task_state spawning. Apr 17 22:09:12 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:12 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:12 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:12 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Acquiring lock "b2751b9c-c966-416d-aaaa-81756198849c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:12 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Lock "b2751b9c-c966-416d-aaaa-81756198849c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:12 user nova-compute[71972]: DEBUG nova.compute.manager [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Starting instance... 
{{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:09:12 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:12 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:12 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:12 user nova-compute[71972]: DEBUG nova.compute.manager [req-871cc8df-703c-446a-8471-d3aea409391b req-ed443353-2251-43f0-8865-23dd28a3fbcf service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Received event network-vif-plugged-7557dbee-f2e2-47a0-88eb-1377350f8504 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:12 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-871cc8df-703c-446a-8471-d3aea409391b req-ed443353-2251-43f0-8865-23dd28a3fbcf service nova] Acquiring lock "3681b009-1a99-4eb8-b189-3fe0647f5d1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:12 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-871cc8df-703c-446a-8471-d3aea409391b req-ed443353-2251-43f0-8865-23dd28a3fbcf service nova] Lock "3681b009-1a99-4eb8-b189-3fe0647f5d1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:12 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-871cc8df-703c-446a-8471-d3aea409391b req-ed443353-2251-43f0-8865-23dd28a3fbcf service nova] Lock "3681b009-1a99-4eb8-b189-3fe0647f5d1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:12 user nova-compute[71972]: DEBUG nova.compute.manager [req-871cc8df-703c-446a-8471-d3aea409391b req-ed443353-2251-43f0-8865-23dd28a3fbcf service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] No waiting events found dispatching network-vif-plugged-7557dbee-f2e2-47a0-88eb-1377350f8504 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:09:12 user nova-compute[71972]: WARNING nova.compute.manager [req-871cc8df-703c-446a-8471-d3aea409391b req-ed443353-2251-43f0-8865-23dd28a3fbcf service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Received unexpected event network-vif-plugged-7557dbee-f2e2-47a0-88eb-1377350f8504 for instance with vm_state building and task_state spawning. 
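The repeated "No waiting events found dispatching network-vif-plugged-... / Received unexpected event ..." pairs above come from the external-event hand-off: Neutron notifies nova-compute that a VIF is plugged, and the compute manager tries to pop a waiter that the spawning thread registered for that (instance, event) pair; if nothing is registered at that moment, the event is dropped with the WARNING seen here. The following is a minimal stdlib sketch of that registry pattern only; the EventRegistry class and its method names are ours, not Nova's implementation.

    # Minimal sketch of the external-instance-event hand-off reflected in the
    # pop_instance_event / "Received unexpected event" records above.
    # Illustrative only -- EventRegistry is hypothetical, not Nova code.
    import threading
    from collections import defaultdict


    class EventRegistry:
        """Per-instance registry of events a spawning thread waits for."""

        def __init__(self):
            self._lock = threading.Lock()
            # (instance_uuid, event_name) -> threading.Event
            self._waiters = defaultdict(threading.Event)
            self._expected = set()

        def prepare(self, instance_uuid, event_name):
            """Called by the spawn path before it starts waiting."""
            with self._lock:
                self._expected.add((instance_uuid, event_name))
                return self._waiters[(instance_uuid, event_name)]

        def pop(self, instance_uuid, event_name):
            """Called when a notification such as network-vif-plugged arrives."""
            with self._lock:
                key = (instance_uuid, event_name)
                if key in self._expected:
                    self._expected.discard(key)
                    self._waiters[key].set()
                    return True
                # Nothing registered: the "Received unexpected event" case.
                return False


    registry = EventRegistry()
    waiter = registry.prepare("3681b009-...", "network-vif-plugged-7557dbee")
    registry.pop("3681b009-...", "network-vif-plugged-7557dbee")   # wakes the waiter
    waiter.wait(timeout=1)
    # A second delivery finds no waiter, i.e. the "unexpected event" path:
    print(registry.pop("3681b009-...", "network-vif-plugged-7557dbee"))  # False
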
Apr 17 22:09:12 user nova-compute[71972]: DEBUG nova.compute.manager [req-7d1112c5-a5f1-4f76-9761-3e73cd884db2 req-6998a135-2f18-4bc5-89dc-085c01e4b5af service nova] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Received event network-vif-plugged-4de27111-4afc-4732-88b3-2485c4f254e8 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:12 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-7d1112c5-a5f1-4f76-9761-3e73cd884db2 req-6998a135-2f18-4bc5-89dc-085c01e4b5af service nova] Acquiring lock "c4fc7798-567a-4002-b056-6c4f02d0e955-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:12 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-7d1112c5-a5f1-4f76-9761-3e73cd884db2 req-6998a135-2f18-4bc5-89dc-085c01e4b5af service nova] Lock "c4fc7798-567a-4002-b056-6c4f02d0e955-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:12 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-7d1112c5-a5f1-4f76-9761-3e73cd884db2 req-6998a135-2f18-4bc5-89dc-085c01e4b5af service nova] Lock "c4fc7798-567a-4002-b056-6c4f02d0e955-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:12 user nova-compute[71972]: DEBUG nova.compute.manager [req-7d1112c5-a5f1-4f76-9761-3e73cd884db2 req-6998a135-2f18-4bc5-89dc-085c01e4b5af service nova] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] No waiting events found dispatching network-vif-plugged-4de27111-4afc-4732-88b3-2485c4f254e8 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:09:12 user nova-compute[71972]: WARNING nova.compute.manager [req-7d1112c5-a5f1-4f76-9761-3e73cd884db2 req-6998a135-2f18-4bc5-89dc-085c01e4b5af service nova] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Received unexpected event network-vif-plugged-4de27111-4afc-4732-88b3-2485c4f254e8 for instance with vm_state building and task_state spawning. 
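Each dispatch above also shows the per-instance "<uuid>-events" lock being acquired and released, together with how long the caller waited for and held it. A rough stdlib sketch of that waited/held accounting follows; timed_lock() is a hypothetical helper written for this trace, not the oslo.concurrency API that produced these records.

    # Sketch of the 'acquired ... waited Ns / released ... held Ns' accounting
    # shown by the oslo_concurrency.lockutils records above, stdlib only.
    import threading
    import time
    from contextlib import contextmanager

    _locks = {}
    _registry_lock = threading.Lock()


    def _get_lock(name):
        # One named lock per instance-events key, created on first use.
        with _registry_lock:
            return _locks.setdefault(name, threading.Lock())


    @contextmanager
    def timed_lock(name, caller):
        lock = _get_lock(name)
        start = time.monotonic()
        lock.acquire()
        waited = time.monotonic() - start
        print(f'Lock "{name}" acquired by "{caller}" :: waited {waited:.3f}s')
        held_start = time.monotonic()
        try:
            yield
        finally:
            lock.release()
            held = time.monotonic() - held_start
            print(f'Lock "{name}" "released" by "{caller}" :: held {held:.3f}s')


    with timed_lock("c4fc7798-...-events", "_pop_event"):
        pass  # pop the waiting event while holding the per-instance lock
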
Apr 17 22:09:12 user nova-compute[71972]: DEBUG nova.compute.manager [req-7d1112c5-a5f1-4f76-9761-3e73cd884db2 req-6998a135-2f18-4bc5-89dc-085c01e4b5af service nova] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Received event network-vif-plugged-06548a29-a501-4b57-97f1-8afe930c8463 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:12 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-7d1112c5-a5f1-4f76-9761-3e73cd884db2 req-6998a135-2f18-4bc5-89dc-085c01e4b5af service nova] Acquiring lock "7bb0580b-727f-4168-9d56-56dcb4fa404e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:12 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-7d1112c5-a5f1-4f76-9761-3e73cd884db2 req-6998a135-2f18-4bc5-89dc-085c01e4b5af service nova] Lock "7bb0580b-727f-4168-9d56-56dcb4fa404e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:12 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-7d1112c5-a5f1-4f76-9761-3e73cd884db2 req-6998a135-2f18-4bc5-89dc-085c01e4b5af service nova] Lock "7bb0580b-727f-4168-9d56-56dcb4fa404e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:12 user nova-compute[71972]: DEBUG nova.compute.manager [req-7d1112c5-a5f1-4f76-9761-3e73cd884db2 req-6998a135-2f18-4bc5-89dc-085c01e4b5af service nova] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] No waiting events found dispatching network-vif-plugged-06548a29-a501-4b57-97f1-8afe930c8463 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:09:12 user nova-compute[71972]: WARNING nova.compute.manager [req-7d1112c5-a5f1-4f76-9761-3e73cd884db2 req-6998a135-2f18-4bc5-89dc-085c01e4b5af service nova] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Received unexpected event network-vif-plugged-06548a29-a501-4b57-97f1-8afe930c8463 for instance with vm_state building and task_state spawning. 
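Several concurrent builds are interleaved in this trace (ports 7557dbee-..., 4de27111-..., 06548a29-..., aede8066-...), so following one port means correlating timestamps, request ids and instance UUIDs across records. Below is a small parser sketch for the journal format shown above; the regexes and field names are assumptions made for this trace, not a supported log format.

    # Rough parser for the nova-compute journal records in this trace, used to
    # correlate one port or instance across interleaved requests. Debugging aid
    # only; the record layout is inferred from the lines above.
    import re

    RECORD = re.compile(
        r"(?P<ts>\w{3} \d+ [\d:]+) \S+ nova-compute\[\d+\]: "
        r"(?P<level>DEBUG|INFO|WARNING|ERROR) (?P<module>\S+) "
        r"\[(?P<ctx>[^\]]*)\]\s*(?P<msg>.*)"
    )
    INSTANCE = re.compile(r"\[instance: (?P<uuid>[0-9a-f-]{36})\]")
    EVENT = re.compile(r"network-vif-plugged-(?P<port>[0-9a-f-]{36})")


    def parse(line):
        m = RECORD.search(line)
        if not m:
            return None
        rec = m.groupdict()
        inst = INSTANCE.search(rec["msg"])
        port = EVENT.search(rec["msg"])
        rec["instance"] = inst.group("uuid") if inst else None
        rec["port"] = port.group("port") if port else None
        return rec


    line = ("Apr 17 22:09:12 user nova-compute[71972]: WARNING nova.compute.manager "
            "[req-871cc8df req-ed443353 service nova] "
            "[instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Received unexpected event "
            "network-vif-plugged-7557dbee-f2e2-47a0-88eb-1377350f8504 ...")
    print(parse(line)["instance"], parse(line)["port"])
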
Apr 17 22:09:13 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:13 user nova-compute[71972]: DEBUG nova.compute.manager [req-e12957e7-d137-4c11-aaa2-0b22d7942862 req-0618b000-3f46-4e78-a66d-5a492d5fd020 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Received event network-vif-plugged-aede8066-45b3-4414-98a0-50dda5a4ee66 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:13 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-e12957e7-d137-4c11-aaa2-0b22d7942862 req-0618b000-3f46-4e78-a66d-5a492d5fd020 service nova] Acquiring lock "99cb7131-abb8-41d6-bddd-c3bc943b7678-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:13 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-e12957e7-d137-4c11-aaa2-0b22d7942862 req-0618b000-3f46-4e78-a66d-5a492d5fd020 service nova] Lock "99cb7131-abb8-41d6-bddd-c3bc943b7678-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:13 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-e12957e7-d137-4c11-aaa2-0b22d7942862 req-0618b000-3f46-4e78-a66d-5a492d5fd020 service nova] Lock "99cb7131-abb8-41d6-bddd-c3bc943b7678-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:13 user nova-compute[71972]: DEBUG nova.compute.manager [req-e12957e7-d137-4c11-aaa2-0b22d7942862 req-0618b000-3f46-4e78-a66d-5a492d5fd020 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] No waiting events found dispatching network-vif-plugged-aede8066-45b3-4414-98a0-50dda5a4ee66 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:09:13 user nova-compute[71972]: WARNING nova.compute.manager [req-e12957e7-d137-4c11-aaa2-0b22d7942862 req-0618b000-3f46-4e78-a66d-5a492d5fd020 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Received unexpected event network-vif-plugged-aede8066-45b3-4414-98a0-50dda5a4ee66 for instance with vm_state building and task_state spawning. 
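For reference, the os-vif plug earlier in this trace (AddPortCommand on br-int plus the DbSetCommand that writes iface-id, iface-status, attached-mac and vm-uuid into the Interface row for tap63dc9a41-e8) corresponds roughly to a single ovs-vsctl invocation. The sketch below only builds and prints that command from the values in the records above; nova-compute itself goes through ovsdbapp rather than the CLI.

    # Build the ovs-vsctl command roughly equivalent to the AddPortCommand +
    # DbSetCommand transaction logged above. Values copied from this trace;
    # the command is printed, not executed.
    import shlex

    bridge = "br-int"
    devname = "tap63dc9a41-e8"
    external_ids = {
        "iface-id": "63dc9a41-e89e-4673-a658-7acddd88706f",
        "iface-status": "active",
        "attached-mac": "fa:16:3e:3e:d4:f0",
        "vm-uuid": "97067629-e099-49fd-bb79-223dd4401405",
    }

    cmd = ["ovs-vsctl", "--may-exist", "add-port", bridge, devname,
           "--", "set", "Interface", devname]
    cmd += [f"external_ids:{key}={value}" for key, value in external_ids.items()]

    print(shlex.join(cmd))
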
Apr 17 22:09:13 user nova-compute[71972]: DEBUG nova.compute.manager [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Instance event wait completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:09:13 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:09:13 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:09:13 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] VM Resumed (Lifecycle Event) Apr 17 22:09:13 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:09:13 user nova-compute[71972]: INFO nova.compute.claims [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Claim successful on node user Apr 17 22:09:14 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Instance spawned successfully. Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:09:14 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:09:14 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
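The libvirt driver record above registers defaults for hw_cdrom_bus, hw_disk_bus, hw_input_bus, hw_pointer_model, hw_video_model and hw_vif_model once the guest is created, so the buses and models chosen at first boot stay stable for later operations on the instance. Below is a dict-based sketch of that "fill only what the image left unset" idea; register_undefined_defaults() is illustrative, not Nova's implementation.

    # Sketch of the "register defaults for undefined image properties" step
    # logged above: record what the guest actually got so later rebuilds reuse
    # the same buses/models. Illustrative only.
    REGISTERED_PROPS = ("hw_cdrom_bus", "hw_disk_bus", "hw_input_bus",
                        "hw_pointer_model", "hw_video_model", "hw_vif_model")


    def register_undefined_defaults(image_props, chosen):
        """Return image_props with gaps filled from what was actually used."""
        filled = dict(image_props)
        for name in REGISTERED_PROPS:
            if name not in filled and name in chosen:
                filled[name] = chosen[name]
        return filled


    image_props = {"hw_rng_model": "virtio"}   # as set on the cirros image above
    chosen = {"hw_disk_bus": "virtio", "hw_cdrom_bus": "ide", "hw_vif_model": "virtio"}
    print(register_undefined_defaults(image_props, chosen))
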
Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:08:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-364503782',display_name='tempest-ServerRescueNegativeTestJSON-server-364503782',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-364503782',id=5,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a863c30ce3844f0ba754b048c2833fa3',ramdisk_id='',reservation_id='r-x1sgqnl0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-2008986942',owner_user_name='tempest-ServerRescueNegativeTestJSON-2008986942-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:08:58Z,user_data=None,user_id='b3e3003057e7456c933b762412442a3e',uuid=9634492c-168d-4b49-941a-b89703571b73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "address": "fa:16:3e:ab:f9:49", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd22f6b6c-44", "ovs_interfaceid": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Converting VIF {"id": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "address": 
"fa:16:3e:ab:f9:49", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd22f6b6c-44", "ovs_interfaceid": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:f9:49,bridge_name='br-int',has_traffic_filtering=True,id=d22f6b6c-44f0-472e-b05f-192e12d56f32,network=Network(bd20802f-022c-4a11-8310-49b2375f642e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd22f6b6c-44') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.objects.instance [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lazy-loading 'pci_devices' on Instance uuid 9634492c-168d-4b49-941a-b89703571b73 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:09:14 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:09:14 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) 
{{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:09:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1596520601',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1596520601',id=7,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIUDw4oqZGwgHOlogieqsyzip0wW30jA5743bGk+uV3e5+U/R9yL7AS0GXvphtOuOOI4CmAsHcasbFw371H+sA0tPQYeuyAIlEJTYvj1WiabqMovvf1nFdGYtWBjTZP1Rw==',key_name='tempest-keypair-1290889891',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9283fe3c9a094f9bbddb08e48973da44',ramdisk_id='',reservation_id='r-pap560xw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1051644628',owner_user_name='tempest-AttachVolumeShelveTestJSON-1051644628-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:09:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a5ec05fe7d7244368c7eec3739a96c19',uuid=cbaa3995-f00d-4194-b7e2-29bfc6e27614,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "11c4742a-b778-458f-9a76-1a8d6330f415", "address": "fa:16:3e:43:2f:41", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c4742a-b7", "ovs_interfaceid": "11c4742a-b778-458f-9a76-1a8d6330f415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Converting VIF {"id": "11c4742a-b778-458f-9a76-1a8d6330f415", "address": "fa:16:3e:43:2f:41", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], 
"meta": {"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c4742a-b7", "ovs_interfaceid": "11c4742a-b778-458f-9a76-1a8d6330f415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:2f:41,bridge_name='br-int',has_traffic_filtering=True,id=11c4742a-b778-458f-9a76-1a8d6330f415,network=Network(2aae2552-3ef8-41d3-84e6-313da6fc203b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c4742a-b7') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.objects.instance [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lazy-loading 'pci_devices' on Instance uuid cbaa3995-f00d-4194-b7e2-29bfc6e27614 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] End _get_guest_xml xml= Apr 17 22:09:14 user nova-compute[71972]: 9634492c-168d-4b49-941a-b89703571b73 Apr 17 22:09:14 user nova-compute[71972]: instance-00000005 Apr 17 22:09:14 user nova-compute[71972]: 131072 Apr 17 22:09:14 user nova-compute[71972]: 1 Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: tempest-ServerRescueNegativeTestJSON-server-364503782 Apr 17 22:09:14 user nova-compute[71972]: 2023-04-17 22:09:14 Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: 128 Apr 17 22:09:14 user nova-compute[71972]: 1 Apr 17 22:09:14 user nova-compute[71972]: 0 Apr 17 22:09:14 user nova-compute[71972]: 0 Apr 17 22:09:14 user nova-compute[71972]: 1 Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: tempest-ServerRescueNegativeTestJSON-2008986942-project-member Apr 17 22:09:14 user nova-compute[71972]: tempest-ServerRescueNegativeTestJSON-2008986942 Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 
22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: OpenStack Foundation Apr 17 22:09:14 user nova-compute[71972]: OpenStack Nova Apr 17 22:09:14 user nova-compute[71972]: 0.0.0 Apr 17 22:09:14 user nova-compute[71972]: 9634492c-168d-4b49-941a-b89703571b73 Apr 17 22:09:14 user nova-compute[71972]: 9634492c-168d-4b49-941a-b89703571b73 Apr 17 22:09:14 user nova-compute[71972]: Virtual Machine Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: hvm Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Nehalem Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: /dev/urandom Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:08:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-364503782',display_name='tempest-ServerRescueNegativeTestJSON-server-364503782',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-364503782',id=5,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a863c30ce3844f0ba754b048c2833fa3',ramdisk_id='',reservation_id='r-x1sgqnl0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-2008986942',owner_user_name='tempest-ServerRescueNegativeTestJSON-2008986942-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:08:58Z,user_data=None,user_id='b3e3003057e7456c933b762412442a3e',uuid=9634492c-168d-4b49-941a-b89703571b73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "address": "fa:16:3e:ab:f9:49", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd22f6b6c-44", "ovs_interfaceid": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Converting VIF {"id": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "address": 
"fa:16:3e:ab:f9:49", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd22f6b6c-44", "ovs_interfaceid": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:f9:49,bridge_name='br-int',has_traffic_filtering=True,id=d22f6b6c-44f0-472e-b05f-192e12d56f32,network=Network(bd20802f-022c-4a11-8310-49b2375f642e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd22f6b6c-44') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG os_vif [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:f9:49,bridge_name='br-int',has_traffic_filtering=True,id=d22f6b6c-44f0-472e-b05f-192e12d56f32,network=Network(bd20802f-022c-4a11-8310-49b2375f642e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd22f6b6c-44') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 
tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Found default for hw_vif_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] End _get_guest_xml xml= Apr 17 22:09:14 user nova-compute[71972]: cbaa3995-f00d-4194-b7e2-29bfc6e27614 Apr 17 22:09:14 user nova-compute[71972]: instance-00000007 Apr 17 22:09:14 user nova-compute[71972]: 131072 Apr 17 22:09:14 user nova-compute[71972]: 1 Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: tempest-AttachVolumeShelveTestJSON-server-1596520601 Apr 17 22:09:14 user nova-compute[71972]: 2023-04-17 22:09:14 Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: 128 Apr 17 22:09:14 user nova-compute[71972]: 1 Apr 17 22:09:14 user nova-compute[71972]: 0 Apr 17 22:09:14 user nova-compute[71972]: 0 Apr 17 22:09:14 user nova-compute[71972]: 1 Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: tempest-AttachVolumeShelveTestJSON-1051644628-project-member Apr 17 22:09:14 user nova-compute[71972]: tempest-AttachVolumeShelveTestJSON-1051644628 Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user 
nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: OpenStack Foundation Apr 17 22:09:14 user nova-compute[71972]: OpenStack Nova Apr 17 22:09:14 user nova-compute[71972]: 0.0.0 Apr 17 22:09:14 user nova-compute[71972]: cbaa3995-f00d-4194-b7e2-29bfc6e27614 Apr 17 22:09:14 user nova-compute[71972]: cbaa3995-f00d-4194-b7e2-29bfc6e27614 Apr 17 22:09:14 user nova-compute[71972]: Virtual Machine Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: hvm Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Nehalem Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: /dev/urandom Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: Apr 17 22:09:14 user nova-compute[71972]: {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:09:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1596520601',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1596520601',id=7,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIUDw4oqZGwgHOlogieqsyzip0wW30jA5743bGk+uV3e5+U/R9yL7AS0GXvphtOuOOI4CmAsHcasbFw371H+sA0tPQYeuyAIlEJTYvj1WiabqMovvf1nFdGYtWBjTZP1Rw==',key_name='tempest-keypair-1290889891',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9283fe3c9a094f9bbddb08e48973da44',ramdisk_id='',reservation_id='r-pap560xw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1051644628',owner_user_name='tempest-AttachVolumeShelveTestJSON-1051644628-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:09:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a5ec05fe7d7244368c7eec3739a96c19',uuid=cbaa3995-f00d-4194-b7e2-29bfc6e27614,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "11c4742a-b778-458f-9a76-1a8d6330f415", "address": "fa:16:3e:43:2f:41", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c4742a-b7", "ovs_interfaceid": "11c4742a-b778-458f-9a76-1a8d6330f415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:09:14 user 
nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Converting VIF {"id": "11c4742a-b778-458f-9a76-1a8d6330f415", "address": "fa:16:3e:43:2f:41", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c4742a-b7", "ovs_interfaceid": "11c4742a-b778-458f-9a76-1a8d6330f415", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:2f:41,bridge_name='br-int',has_traffic_filtering=True,id=11c4742a-b778-458f-9a76-1a8d6330f415,network=Network(2aae2552-3ef8-41d3-84e6-313da6fc203b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c4742a-b7') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG os_vif [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:2f:41,bridge_name='br-int',has_traffic_filtering=True,id=11c4742a-b778-458f-9a76-1a8d6330f415,network=Network(2aae2552-3ef8-41d3-84e6-313da6fc203b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c4742a-b7') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Instance event wait 
completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd22f6b6c-44, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd22f6b6c-44, col_values=(('external_ids', {'iface-id': 'd22f6b6c-44f0-472e-b05f-192e12d56f32', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:f9:49', 'vm-uuid': '9634492c-168d-4b49-941a-b89703571b73'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:14 user nova-compute[71972]: INFO os_vif [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:f9:49,bridge_name='br-int',has_traffic_filtering=True,id=d22f6b6c-44f0-472e-b05f-192e12d56f32,network=Network(bd20802f-022c-4a11-8310-49b2375f642e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd22f6b6c-44') Apr 17 22:09:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap11c4742a-b7, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap11c4742a-b7, col_values=(('external_ids', {'iface-id': 
'11c4742a-b778-458f-9a76-1a8d6330f415', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:43:2f:41', 'vm-uuid': 'cbaa3995-f00d-4194-b7e2-29bfc6e27614'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:09:14 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Instance spawned successfully. Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:14 user nova-compute[71972]: INFO os_vif [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:2f:41,bridge_name='br-int',has_traffic_filtering=True,id=11c4742a-b778-458f-9a76-1a8d6330f415,network=Network(2aae2552-3ef8-41d3-84e6-313da6fc203b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c4742a-b7') Apr 17 22:09:14 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] During sync_power_state the instance has a pending task (spawning). Skip. 
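The AddBridgeCommand / AddPortCommand / DbSetCommand transactions above are the os-vif 'ovs' plugin talking to the local ovsdb-server through ovsdbapp. A sketch of the same three operations with ovsdbapp's Open_vSwitch schema API is shown below; the endpoint and timeout are assumptions, while the commands and their arguments mirror the log:

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    # Endpoint is an assumption; DevStack typically runs ovsdb-server locally.
    idl = connection.OvsdbIdl.from_server('tcp:127.0.0.1:6640', 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

    with api.transaction(check_error=True) as txn:
        # "Transaction caused no change" above means br-int already existed,
        # so this may_exist=True bridge add was a no-op.
        txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
        # Plug the guest's tap device and tag it so the OVN/agent side can
        # associate the port with the Neutron port and the instance.
        txn.add(api.add_port('br-int', 'tapd22f6b6c-44', may_exist=True))
        txn.add(api.db_set(
            'Interface', 'tapd22f6b6c-44',
            ('external_ids', {
                'iface-id': 'd22f6b6c-44f0-472e-b05f-192e12d56f32',
                'iface-status': 'active',
                'attached-mac': 'fa:16:3e:ab:f9:49',
                'vm-uuid': '9634492c-168d-4b49-941a-b89703571b73',
            })))

Once the transaction commits, os_vif reports "Successfully plugged vif" as seen above.
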
Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:09:14 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] VM Started (Lifecycle Event) Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Found default for hw_vif_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) 
handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:09:14 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:09:14 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] VM Resumed (Lifecycle Event) Apr 17 22:09:14 user nova-compute[71972]: INFO nova.compute.manager [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Took 18.12 seconds to spawn the instance on the hypervisor. Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Instance event wait completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Instance event wait completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Instance event wait completed in 0 seconds for 
{{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] No BDM found with device name vda, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] No VIF found with MAC fa:16:3e:ab:f9:49, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] No BDM found with device name vda, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] No VIF found with MAC fa:16:3e:43:2f:41, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:09:14 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Instance spawned successfully. Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:09:14 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 97067629-e099-49fd-bb79-223dd4401405] Instance spawned successfully. 
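The pattern above repeats for every spawning guest: libvirt raises a lifecycle event ("Started", then "Resumed"), the compute manager re-checks the power state, and, because the instance still carries task_state spawning, the reconciliation is skipped instead of overriding the in-flight build. A minimal illustrative sketch of that decision, using the power_state values as they appear in the log (0 in the DB, 1 reported by the VM); this is not Nova's actual implementation:

    # Illustrative only: mirrors the "Synchronizing instance power state ..." /
    # "... pending task (spawning). Skip." pairs in the log above.
    NOSTATE, RUNNING = 0, 1  # DB power_state vs. hypervisor-reported power_state

    def sync_power_state(instance: dict, vm_power_state: int) -> None:
        if instance["task_state"] is not None:
            # Another operation (here: spawning) owns the instance; defer the sync.
            print("During sync_power_state the instance has a pending task "
                  f"({instance['task_state']}). Skip.")
            return
        if instance["power_state"] != vm_power_state:
            instance["power_state"] = vm_power_state  # persist the observed state

    sync_power_state({"task_state": "spawning", "power_state": NOSTATE}, RUNNING)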
Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:09:14 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Instance spawned successfully. Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:14 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:09:14 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] VM Started (Lifecycle Event) Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 
99cb7131-abb8-41d6-bddd-c3bc943b7678] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Found default for hw_vif_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: INFO nova.compute.manager [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Took 18.99 seconds to spawn the instance on the hypervisor. Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Found default for hw_pointer_model of None 
{{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Found default for hw_vif_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Found default for hw_vif_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] 
[instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:09:14 user nova-compute[71972]: INFO nova.compute.manager [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Took 19.69 seconds to build instance. Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:09:14 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:09:14 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] VM Resumed (Lifecycle Event) Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-81bc81bb-e500-401f-835f-7d6e141a4b22 tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Lock "c4fc7798-567a-4002-b056-6c4f02d0e955" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 19.971s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:14 user nova-compute[71972]: INFO nova.compute.manager [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Took 19.99 seconds to build instance. Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:09:14 user nova-compute[71972]: INFO nova.compute.manager [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Took 14.69 seconds to spawn the instance on the hypervisor. 
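The "Attempting to register defaults for the following image properties" / "Found default for hw_disk_bus of virtio" entries record which hardware-related image properties were left unset for this guest and which default the libvirt driver picked, roughly so that later operations keep using the same device buses. A rough sketch of that bookkeeping; the property list and default values are taken from the log, while the helper and dict names are made up for illustration:

    # Hypothetical helper: note the defaults the driver would otherwise choose
    # silently for properties the image did not pin.
    DRIVER_DEFAULTS = {
        "hw_cdrom_bus": "ide",
        "hw_disk_bus": "virtio",
        "hw_input_bus": None,
        "hw_pointer_model": None,
        "hw_video_model": "virtio",
        "hw_vif_model": "virtio",
    }

    def register_undefined_details(image_properties: dict) -> dict:
        registered = {}
        for prop, default in DRIVER_DEFAULTS.items():
            if prop not in image_properties:   # image did not define this property
                registered[prop] = default     # -> "Found default for <prop> of <value>"
        return registered

    print(register_undefined_details({"hw_disk_bus": "scsi"}))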
Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:14 user nova-compute[71972]: INFO nova.compute.manager [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Took 18.91 seconds to spawn the instance on the hypervisor. Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [req-9116be8b-6acc-4e8b-9716-6f7dd4272af4 req-73f79055-48b6-44eb-a6fb-263718b1c532 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] Received event network-vif-plugged-63dc9a41-e89e-4673-a658-7acddd88706f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-9116be8b-6acc-4e8b-9716-6f7dd4272af4 req-73f79055-48b6-44eb-a6fb-263718b1c532 service nova] Acquiring lock "97067629-e099-49fd-bb79-223dd4401405-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-9116be8b-6acc-4e8b-9716-6f7dd4272af4 req-73f79055-48b6-44eb-a6fb-263718b1c532 service nova] Lock "97067629-e099-49fd-bb79-223dd4401405-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-9116be8b-6acc-4e8b-9716-6f7dd4272af4 req-73f79055-48b6-44eb-a6fb-263718b1c532 service nova] Lock "97067629-e099-49fd-bb79-223dd4401405-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [req-9116be8b-6acc-4e8b-9716-6f7dd4272af4 req-73f79055-48b6-44eb-a6fb-263718b1c532 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] No waiting events found dispatching network-vif-plugged-63dc9a41-e89e-4673-a658-7acddd88706f {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:09:14 user nova-compute[71972]: WARNING nova.compute.manager [req-9116be8b-6acc-4e8b-9716-6f7dd4272af4 req-73f79055-48b6-44eb-a6fb-263718b1c532 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] Received unexpected event network-vif-plugged-63dc9a41-e89e-4673-a658-7acddd88706f for instance with vm_state building and task_state spawning. 
Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [req-9116be8b-6acc-4e8b-9716-6f7dd4272af4 req-73f79055-48b6-44eb-a6fb-263718b1c532 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] Received event network-vif-plugged-63dc9a41-e89e-4673-a658-7acddd88706f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-9116be8b-6acc-4e8b-9716-6f7dd4272af4 req-73f79055-48b6-44eb-a6fb-263718b1c532 service nova] Acquiring lock "97067629-e099-49fd-bb79-223dd4401405-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-9116be8b-6acc-4e8b-9716-6f7dd4272af4 req-73f79055-48b6-44eb-a6fb-263718b1c532 service nova] Lock "97067629-e099-49fd-bb79-223dd4401405-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-9116be8b-6acc-4e8b-9716-6f7dd4272af4 req-73f79055-48b6-44eb-a6fb-263718b1c532 service nova] Lock "97067629-e099-49fd-bb79-223dd4401405-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [req-9116be8b-6acc-4e8b-9716-6f7dd4272af4 req-73f79055-48b6-44eb-a6fb-263718b1c532 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] No waiting events found dispatching network-vif-plugged-63dc9a41-e89e-4673-a658-7acddd88706f {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:09:14 user nova-compute[71972]: WARNING nova.compute.manager [req-9116be8b-6acc-4e8b-9716-6f7dd4272af4 req-73f79055-48b6-44eb-a6fb-263718b1c532 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] Received unexpected event network-vif-plugged-63dc9a41-e89e-4673-a658-7acddd88706f for instance with vm_state building and task_state spawning. Apr 17 22:09:14 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] During sync_power_state the instance has a pending task (spawning). Skip. 
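The repeated WARNING above is typically benign here: Neutron sends network-vif-plugged once the port goes active, but by then the spawn path for this instance has either already consumed or never registered a waiter for that event, so pop_instance_event finds nothing and the manager just logs the event as unexpected while the build carries on. A small sketch of that dispatch, using a per-instance dict of pending events rather than Nova's real bookkeeping:

    # Illustration of "No waiting events found dispatching network-vif-plugged-...":
    # if nothing registered interest in the event, it is reported as unexpected.
    import threading

    _events_lock = threading.Lock()
    _waiting_events = {}   # {instance_uuid: {event_name: threading.Event}}

    def external_instance_event(instance_uuid: str, event_name: str) -> None:
        with _events_lock:                      # cf. the "<uuid>-events" lock in the log
            waiter = _waiting_events.get(instance_uuid, {}).pop(event_name, None)
        if waiter is None:
            print(f"WARNING: Received unexpected event {event_name} "
                  f"for instance {instance_uuid}")
            return
        waiter.set()                            # wake whoever called wait_for_instance_event

    external_instance_event("97067629-e099-49fd-bb79-223dd4401405",
                            "network-vif-plugged-63dc9a41-e89e-4673-a658-7acddd88706f")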
Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:09:14 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] VM Started (Lifecycle Event) Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9d4d25b-efdf-40f3-b3a7-63e5eb9ef514 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "3681b009-1a99-4eb8-b189-3fe0647f5d1d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 20.266s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:14 user nova-compute[71972]: INFO nova.compute.manager [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Took 21.20 seconds to spawn the instance on the hypervisor. 
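The inventory payload logged above also shows how much capacity the scheduler sees on this node: for each resource class the schedulable amount is (total - reserved) * allocation_ratio, which for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e works out to 48 VCPU, 15511 MB of RAM and 40 GB of disk. A short snippet reproducing the numbers from the dict in the log:

    # Effective capacity computed from the inventory data logged above.
    inventory = {
        "MEMORY_MB": {"total": 16023, "reserved": 512, "allocation_ratio": 1.0},
        "VCPU":      {"total": 12,    "reserved": 0,   "allocation_ratio": 4.0},
        "DISK_GB":   {"total": 40,    "reserved": 0,   "allocation_ratio": 1.0},
    }
    for rc, inv in inventory.items():
        schedulable = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(rc, int(schedulable))   # MEMORY_MB 15511, VCPU 48, DISK_GB 40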
Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:14 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG nova.compute.manager [req-fc019e2a-833e-4ef8-9c1f-825045e8cfb3 req-daaf083b-9735-467e-b229-d88c11c0d02c service nova] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Received event network-vif-plugged-06548a29-a501-4b57-97f1-8afe930c8463 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-fc019e2a-833e-4ef8-9c1f-825045e8cfb3 req-daaf083b-9735-467e-b229-d88c11c0d02c service nova] Acquiring lock "7bb0580b-727f-4168-9d56-56dcb4fa404e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-fc019e2a-833e-4ef8-9c1f-825045e8cfb3 req-daaf083b-9735-467e-b229-d88c11c0d02c service nova] Lock "7bb0580b-727f-4168-9d56-56dcb4fa404e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-fc019e2a-833e-4ef8-9c1f-825045e8cfb3 req-daaf083b-9735-467e-b229-d88c11c0d02c service nova] Lock "7bb0580b-727f-4168-9d56-56dcb4fa404e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG nova.compute.manager [req-fc019e2a-833e-4ef8-9c1f-825045e8cfb3 req-daaf083b-9735-467e-b229-d88c11c0d02c service nova] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] No waiting events found dispatching network-vif-plugged-06548a29-a501-4b57-97f1-8afe930c8463 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:09:15 user nova-compute[71972]: WARNING nova.compute.manager [req-fc019e2a-833e-4ef8-9c1f-825045e8cfb3 req-daaf083b-9735-467e-b229-d88c11c0d02c service nova] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Received unexpected event network-vif-plugged-06548a29-a501-4b57-97f1-8afe930c8463 for instance with vm_state building and task_state spawning. 
Apr 17 22:09:15 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.560s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG nova.compute.manager [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:09:15 user nova-compute[71972]: INFO nova.compute.manager [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Took 20.28 seconds to build instance. Apr 17 22:09:15 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:09:15 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:09:15 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] VM Resumed (Lifecycle Event) Apr 17 22:09:15 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-47680b27-e388-4a06-999d-3da934699006 tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Lock "99cb7131-abb8-41d6-bddd-c3bc943b7678" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 20.474s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:15 user nova-compute[71972]: INFO nova.compute.manager [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Took 22.30 seconds to build instance. Apr 17 22:09:15 user nova-compute[71972]: INFO nova.compute.manager [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Took 15.71 seconds to build instance. 
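The Lock "<uuid>" ... "released" by "..._locked_do_build_and_run_instance" :: held 20.474s entries come from oslo.concurrency's synchronized decorator, which serializes builds per instance UUID and logs how long each holder kept the lock when debug logging is enabled. A hedged usage sketch; the decorated function body is a stand-in, not Nova's build path:

    # Sketch of the per-instance serialization behind the "held N.NNNs" lines.
    # Requires oslo.concurrency to be installed.
    import time
    from oslo_concurrency import lockutils

    def build_instance_locked(instance_uuid: str) -> None:
        @lockutils.synchronized(instance_uuid)      # same lock name as in the log
        def _locked_do_build_and_run_instance():
            time.sleep(0.1)                         # stand-in for the actual build
        _locked_do_build_and_run_instance()

    build_instance_locked("c4fc7798-567a-4002-b056-6c4f02d0e955")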
Apr 17 22:09:15 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5c58665d-9b98-4ee6-8a5c-0a95368fda14 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "97067629-e099-49fd-bb79-223dd4401405" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 15.832s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:09:15 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] VM Started (Lifecycle Event) Apr 17 22:09:15 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-8c40e09b-919c-4b14-b8e6-e6a5b5ad50f6 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "7bb0580b-727f-4168-9d56-56dcb4fa404e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 22.694s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG nova.compute.manager [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Allocating IP information in the background. 
{{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG nova.network.neutron [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:09:15 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 97067629-e099-49fd-bb79-223dd4401405] VM Resumed (Lifecycle Event) Apr 17 22:09:15 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Ignoring supplied device name: /dev/sda. Libvirt can't honour user-supplied dev names Apr 17 22:09:15 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 97067629-e099-49fd-bb79-223dd4401405] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 97067629-e099-49fd-bb79-223dd4401405] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG nova.compute.manager [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Start building block device mappings for instance. 
{{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:09:15 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 97067629-e099-49fd-bb79-223dd4401405] VM Started (Lifecycle Event) Apr 17 22:09:15 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 97067629-e099-49fd-bb79-223dd4401405] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 97067629-e099-49fd-bb79-223dd4401405] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG nova.compute.manager [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Start spawning the instance on the hypervisor. {{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:09:15 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Creating image(s) Apr 17 22:09:15 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Acquiring lock "/opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Lock "/opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Lock 
"/opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Acquiring lock "18a6ad4769f6b724efc3abf06f13a3648ad9473f" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Lock "18a6ad4769f6b724efc3abf06f13a3648ad9473f" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG nova.policy [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '79a02ae084a541b1b7a3fda0190b9ae4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fd70480cd5364a3185fe097f88c290ae', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:09:15 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:16 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/18a6ad4769f6b724efc3abf06f13a3648ad9473f.part --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:16 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] CMD "/usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/18a6ad4769f6b724efc3abf06f13a3648ad9473f.part --force-share --output=json" returned: 0 in 0.191s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:16 user nova-compute[71972]: DEBUG nova.virt.images [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] caf3ea13-92a9-40f8-bd4a-51f6b5c53327 was qcow2, converting to raw {{(pid=71972) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 17 22:09:16 user nova-compute[71972]: DEBUG nova.privsep.utils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71972) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 17 22:09:16 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/18a6ad4769f6b724efc3abf06f13a3648ad9473f.part /opt/stack/data/nova/instances/_base/18a6ad4769f6b724efc3abf06f13a3648ad9473f.converted {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:16 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/18a6ad4769f6b724efc3abf06f13a3648ad9473f.part /opt/stack/data/nova/instances/_base/18a6ad4769f6b724efc3abf06f13a3648ad9473f.converted" returned: 0 in 0.314s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:16 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/18a6ad4769f6b724efc3abf06f13a3648ad9473f.converted --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:17 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/18a6ad4769f6b724efc3abf06f13a3648ad9473f.converted --force-share 
--output=json" returned: 0 in 0.256s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:17 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Lock "18a6ad4769f6b724efc3abf06f13a3648ad9473f" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.316s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:17 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/18a6ad4769f6b724efc3abf06f13a3648ad9473f --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:17 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/18a6ad4769f6b724efc3abf06f13a3648ad9473f --force-share --output=json" returned: 0 in 0.162s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:17 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Acquiring lock "18a6ad4769f6b724efc3abf06f13a3648ad9473f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:17 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Lock "18a6ad4769f6b724efc3abf06f13a3648ad9473f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:17 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/18a6ad4769f6b724efc3abf06f13a3648ad9473f --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:17 user nova-compute[71972]: DEBUG nova.network.neutron [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Successfully created port: 
ece1440f-e4b0-4eef-9ab4-6a741473bf2f {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:09:17 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/18a6ad4769f6b724efc3abf06f13a3648ad9473f --force-share --output=json" returned: 0 in 0.187s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:17 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/18a6ad4769f6b724efc3abf06f13a3648ad9473f,backing_fmt=raw /opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:17 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:17 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/18a6ad4769f6b724efc3abf06f13a3648ad9473f,backing_fmt=raw /opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c/disk 1073741824" returned: 0 in 0.052s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:17 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Lock "18a6ad4769f6b724efc3abf06f13a3648ad9473f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.244s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:17 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/18a6ad4769f6b724efc3abf06f13a3648ad9473f --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:17 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:17 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] CMD "/usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/18a6ad4769f6b724efc3abf06f13a3648ad9473f --force-share --output=json" returned: 0 in 0.250s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:17 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Checking if we can resize image /opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c/disk. size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:09:17 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:18 user nova-compute[71972]: DEBUG nova.compute.manager [req-ec95ab3f-aace-4e3f-9158-b5ed237e94cd req-5635b66d-c701-4059-a76d-e3deac607b63 service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Received event network-vif-plugged-11c4742a-b778-458f-9a76-1a8d6330f415 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:18 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-ec95ab3f-aace-4e3f-9158-b5ed237e94cd req-5635b66d-c701-4059-a76d-e3deac607b63 service nova] Acquiring lock "cbaa3995-f00d-4194-b7e2-29bfc6e27614-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:18 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-ec95ab3f-aace-4e3f-9158-b5ed237e94cd req-5635b66d-c701-4059-a76d-e3deac607b63 service nova] Lock "cbaa3995-f00d-4194-b7e2-29bfc6e27614-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:18 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-ec95ab3f-aace-4e3f-9158-b5ed237e94cd req-5635b66d-c701-4059-a76d-e3deac607b63 service nova] Lock "cbaa3995-f00d-4194-b7e2-29bfc6e27614-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:18 user nova-compute[71972]: DEBUG nova.compute.manager [req-ec95ab3f-aace-4e3f-9158-b5ed237e94cd req-5635b66d-c701-4059-a76d-e3deac607b63 service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] No waiting events found dispatching network-vif-plugged-11c4742a-b778-458f-9a76-1a8d6330f415 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:09:18 user nova-compute[71972]: WARNING nova.compute.manager [req-ec95ab3f-aace-4e3f-9158-b5ed237e94cd req-5635b66d-c701-4059-a76d-e3deac607b63 service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Received unexpected event 
network-vif-plugged-11c4742a-b778-458f-9a76-1a8d6330f415 for instance with vm_state building and task_state spawning. Apr 17 22:09:18 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c/disk --force-share --output=json" returned: 0 in 0.335s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:18 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Cannot resize image /opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c/disk to a smaller size. {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:09:18 user nova-compute[71972]: DEBUG nova.objects.instance [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Lazy-loading 'migration_context' on Instance uuid b2751b9c-c966-416d-aaaa-81756198849c {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:09:18 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:09:18 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Ensure instance console log exists: /opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:09:18 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:18 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:18 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71972) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:19 user nova-compute[71972]: DEBUG nova.compute.manager [req-774ef2cf-5df7-455f-95f0-e0dbfaa05aad req-16d5a77a-ba00-48fa-bcdc-ed43ce158558 service nova] [instance: 9634492c-168d-4b49-941a-b89703571b73] Received event network-vif-plugged-d22f6b6c-44f0-472e-b05f-192e12d56f32 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:19 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-774ef2cf-5df7-455f-95f0-e0dbfaa05aad req-16d5a77a-ba00-48fa-bcdc-ed43ce158558 service nova] Acquiring lock "9634492c-168d-4b49-941a-b89703571b73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:19 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-774ef2cf-5df7-455f-95f0-e0dbfaa05aad req-16d5a77a-ba00-48fa-bcdc-ed43ce158558 service nova] Lock "9634492c-168d-4b49-941a-b89703571b73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:19 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-774ef2cf-5df7-455f-95f0-e0dbfaa05aad req-16d5a77a-ba00-48fa-bcdc-ed43ce158558 service nova] Lock "9634492c-168d-4b49-941a-b89703571b73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:19 user nova-compute[71972]: DEBUG nova.compute.manager [req-774ef2cf-5df7-455f-95f0-e0dbfaa05aad req-16d5a77a-ba00-48fa-bcdc-ed43ce158558 service nova] [instance: 9634492c-168d-4b49-941a-b89703571b73] No waiting events found dispatching network-vif-plugged-d22f6b6c-44f0-472e-b05f-192e12d56f32 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:09:19 user nova-compute[71972]: WARNING nova.compute.manager [req-774ef2cf-5df7-455f-95f0-e0dbfaa05aad req-16d5a77a-ba00-48fa-bcdc-ed43ce158558 service nova] [instance: 9634492c-168d-4b49-941a-b89703571b73] Received unexpected event network-vif-plugged-d22f6b6c-44f0-472e-b05f-192e12d56f32 for instance with vm_state building and task_state spawning. 
Apr 17 22:09:19 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:19 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:19 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:19 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:20 user nova-compute[71972]: DEBUG nova.compute.manager [req-201079a3-5b9c-4d4c-ac7d-006485547801 req-c0a8fd9f-4bb2-4e7d-b6bc-bf2914d5899d service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Received event network-vif-plugged-11c4742a-b778-458f-9a76-1a8d6330f415 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:20 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-201079a3-5b9c-4d4c-ac7d-006485547801 req-c0a8fd9f-4bb2-4e7d-b6bc-bf2914d5899d service nova] Acquiring lock "cbaa3995-f00d-4194-b7e2-29bfc6e27614-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:20 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-201079a3-5b9c-4d4c-ac7d-006485547801 req-c0a8fd9f-4bb2-4e7d-b6bc-bf2914d5899d service nova] Lock "cbaa3995-f00d-4194-b7e2-29bfc6e27614-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:20 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-201079a3-5b9c-4d4c-ac7d-006485547801 req-c0a8fd9f-4bb2-4e7d-b6bc-bf2914d5899d service nova] Lock "cbaa3995-f00d-4194-b7e2-29bfc6e27614-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:20 user nova-compute[71972]: DEBUG nova.compute.manager [req-201079a3-5b9c-4d4c-ac7d-006485547801 req-c0a8fd9f-4bb2-4e7d-b6bc-bf2914d5899d service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] No waiting events found dispatching network-vif-plugged-11c4742a-b778-458f-9a76-1a8d6330f415 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:09:20 user nova-compute[71972]: WARNING nova.compute.manager [req-201079a3-5b9c-4d4c-ac7d-006485547801 req-c0a8fd9f-4bb2-4e7d-b6bc-bf2914d5899d service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Received unexpected event network-vif-plugged-11c4742a-b778-458f-9a76-1a8d6330f415 for instance with vm_state building and task_state spawning. 
Apr 17 22:09:20 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:20 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:20 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:20 user nova-compute[71972]: DEBUG nova.compute.manager [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Instance event wait completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:09:20 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:09:20 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:09:20 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 9634492c-168d-4b49-941a-b89703571b73] VM Resumed (Lifecycle Event) Apr 17 22:09:20 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 9634492c-168d-4b49-941a-b89703571b73] Instance spawned successfully. 
Apr 17 22:09:20 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 9634492c-168d-4b49-941a-b89703571b73] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 9634492c-168d-4b49-941a-b89703571b73] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} 
Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Found default for hw_vif_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.compute.manager [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Instance event wait completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.compute.manager [req-35b560d0-af79-46fa-8abe-57471b97c783 req-b78315b9-588d-46cd-a4d5-44dc4e2309ee service nova] [instance: 9634492c-168d-4b49-941a-b89703571b73] Received event network-vif-plugged-d22f6b6c-44f0-472e-b05f-192e12d56f32 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-35b560d0-af79-46fa-8abe-57471b97c783 req-b78315b9-588d-46cd-a4d5-44dc4e2309ee service nova] Acquiring lock "9634492c-168d-4b49-941a-b89703571b73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-35b560d0-af79-46fa-8abe-57471b97c783 req-b78315b9-588d-46cd-a4d5-44dc4e2309ee service nova] Lock "9634492c-168d-4b49-941a-b89703571b73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-35b560d0-af79-46fa-8abe-57471b97c783 req-b78315b9-588d-46cd-a4d5-44dc4e2309ee service nova] Lock "9634492c-168d-4b49-941a-b89703571b73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.compute.manager [req-35b560d0-af79-46fa-8abe-57471b97c783 req-b78315b9-588d-46cd-a4d5-44dc4e2309ee service nova] [instance: 9634492c-168d-4b49-941a-b89703571b73] No waiting events found dispatching network-vif-plugged-d22f6b6c-44f0-472e-b05f-192e12d56f32 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:09:21 user nova-compute[71972]: WARNING nova.compute.manager [req-35b560d0-af79-46fa-8abe-57471b97c783 req-b78315b9-588d-46cd-a4d5-44dc4e2309ee service nova] [instance: 9634492c-168d-4b49-941a-b89703571b73] Received unexpected event network-vif-plugged-d22f6b6c-44f0-472e-b05f-192e12d56f32 for instance with vm_state building and task_state 
spawning. Apr 17 22:09:21 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 9634492c-168d-4b49-941a-b89703571b73] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:09:21 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 9634492c-168d-4b49-941a-b89703571b73] VM Started (Lifecycle Event) Apr 17 22:09:21 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Instance spawned successfully. Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 9634492c-168d-4b49-941a-b89703571b73] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 9634492c-168d-4b49-941a-b89703571b73] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 
tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Found default for hw_vif_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:21 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 9634492c-168d-4b49-941a-b89703571b73] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:09:21 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] VM Resumed (Lifecycle Event) Apr 17 22:09:21 user nova-compute[71972]: INFO nova.compute.manager [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Took 23.72 seconds to spawn the instance on the hypervisor. Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.compute.manager [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:09:21 user nova-compute[71972]: INFO nova.compute.manager [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Took 19.36 seconds to spawn the instance on the hypervisor. 
Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.compute.manager [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:21 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:09:21 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] VM Started (Lifecycle Event) Apr 17 22:09:21 user nova-compute[71972]: INFO nova.compute.manager [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Took 24.85 seconds to build instance. Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:09:21 user nova-compute[71972]: INFO nova.compute.manager [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Took 20.30 seconds to build instance. 
Apr 17 22:09:21 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-40dc8ad9-f804-48ad-a816-665a1592bf27 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "9634492c-168d-4b49-941a-b89703571b73" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 25.014s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e8d8084f-c1d9-42b7-8db7-daa95bf03c02 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "cbaa3995-f00d-4194-b7e2-29bfc6e27614" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 20.468s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.network.neutron [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Successfully updated port: ece1440f-e4b0-4eef-9ab4-6a741473bf2f {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Acquiring lock "refresh_cache-b2751b9c-c966-416d-aaaa-81756198849c" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Acquired lock "refresh_cache-b2751b9c-c966-416d-aaaa-81756198849c" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.network.neutron [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:09:21 user nova-compute[71972]: DEBUG nova.network.neutron [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Instance cache missing network info. 
{{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:09:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:22 user nova-compute[71972]: DEBUG nova.network.neutron [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Updating instance_info_cache with network_info: [{"id": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "address": "fa:16:3e:96:4d:19", "network": {"id": "d5f5ecb0-5758-4318-bb86-c30fc214049a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-43150130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fd70480cd5364a3185fe097f88c290ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapece1440f-e4", "ovs_interfaceid": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:22 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Releasing lock "refresh_cache-b2751b9c-c966-416d-aaaa-81756198849c" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:22 user nova-compute[71972]: DEBUG nova.compute.manager [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Instance network_info: |[{"id": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "address": "fa:16:3e:96:4d:19", "network": {"id": "d5f5ecb0-5758-4318-bb86-c30fc214049a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-43150130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fd70480cd5364a3185fe097f88c290ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapece1440f-e4", "ovs_interfaceid": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:09:22 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None 
req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Start _get_guest_xml network_info=[{"id": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "address": "fa:16:3e:96:4d:19", "network": {"id": "d5f5ecb0-5758-4318-bb86-c30fc214049a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-43150130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fd70480cd5364a3185fe097f88c290ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapece1440f-e4", "ovs_interfaceid": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'scsi', 'cdrom_bus': 'scsi', 'mapping': {'root': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'scsi', 'dev': 'sdb', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:09:01Z,direct_url=,disk_format='qcow2',id=caf3ea13-92a9-40f8-bd4a-51f6b5c53327,min_disk=0,min_ram=0,name='',owner='5fa4a52bd9634a16b9c192ae79d08c6d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:09:03Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/sda', 'image': [{'disk_bus': 'scsi', 'boot_index': 0, 'device_name': '/dev/sda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': 'caf3ea13-92a9-40f8-bd4a-51f6b5c53327'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:09:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:22 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:09:22 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 22:09:22 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:09:22 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:09:01Z,direct_url=,disk_format='qcow2',id=caf3ea13-92a9-40f8-bd4a-51f6b5c53327,min_disk=0,min_ram=0,name='',owner='5fa4a52bd9634a16b9c192ae79d08c6d',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:09:03Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:09:22 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:09:22 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71972) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-17T22:09:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-518621003',display_name='tempest-AttachSCSIVolumeTestJSON-server-518621003',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-518621003',id=8,image_ref='caf3ea13-92a9-40f8-bd4a-51f6b5c53327',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAXWDpAmHJN6DEa8aBiSGZSIAf3NwsuJLBqtY30pGhgpUFLMNyDpGDxGPXPI9IR47QvVMEhsqH5smKDhL4yjmsCvpuEmnNtj1uTFOgYDJS1VE0IAdB4c78I6NkPJBVnexQ==',key_name='tempest-keypair-1545084897',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd70480cd5364a3185fe097f88c290ae',ramdisk_id='',reservation_id='r-x3b00iug',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='caf3ea13-92a9-40f8-bd4a-51f6b5c53327',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-885264932',owner_user_name='tempest-AttachSCSIVolumeTestJSON-885264932-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:09:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='79a02ae084a541b1b7a3fda0190b9ae4',uuid=b2751b9c-c966-416d-aaaa-81756198849c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "address": "fa:16:3e:96:4d:19", "network": {"id": "d5f5ecb0-5758-4318-bb86-c30fc214049a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-43150130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fd70480cd5364a3185fe097f88c290ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapece1440f-e4", "ovs_interfaceid": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Converting VIF {"id": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "address": "fa:16:3e:96:4d:19", "network": {"id": "d5f5ecb0-5758-4318-bb86-c30fc214049a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-43150130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fd70480cd5364a3185fe097f88c290ae", "mtu": 1442, "physical_network": null,
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapece1440f-e4", "ovs_interfaceid": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:4d:19,bridge_name='br-int',has_traffic_filtering=True,id=ece1440f-e4b0-4eef-9ab4-6a741473bf2f,network=Network(d5f5ecb0-5758-4318-bb86-c30fc214049a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapece1440f-e4') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.objects.instance [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Lazy-loading 'pci_devices' on Instance uuid b2751b9c-c966-416d-aaaa-81756198849c {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] End _get_guest_xml xml=
Apr 17 22:09:23 user nova-compute[71972]: [guest domain XML elided: the XML markup was lost in extraction and only the text values survive; in order they are: b2751b9c-c966-416d-aaaa-81756198849c, instance-00000008, 131072, 1, tempest-AttachSCSIVolumeTestJSON-server-518621003, 2023-04-17 22:09:22, 128, 1, 0, 0, 1, tempest-AttachSCSIVolumeTestJSON-885264932-project-member, tempest-AttachSCSIVolumeTestJSON-885264932, OpenStack Foundation, OpenStack Nova, 0.0.0, b2751b9c-c966-416d-aaaa-81756198849c, Virtual Machine, hvm, Nehalem, /dev/urandom] {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}}
Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-17T22:09:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-518621003',display_name='tempest-AttachSCSIVolumeTestJSON-server-518621003',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-518621003',id=8,image_ref='caf3ea13-92a9-40f8-bd4a-51f6b5c53327',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAXWDpAmHJN6DEa8aBiSGZSIAf3NwsuJLBqtY30pGhgpUFLMNyDpGDxGPXPI9IR47QvVMEhsqH5smKDhL4yjmsCvpuEmnNtj1uTFOgYDJS1VE0IAdB4c78I6NkPJBVnexQ==',key_name='tempest-keypair-1545084897',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fd70480cd5364a3185fe097f88c290ae',ramdisk_id='',reservation_id='r-x3b00iug',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='caf3ea13-92a9-40f8-bd4a-51f6b5c53327',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='pc',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-885264932',owner_user_name='tempest-AttachSCSIVolumeTestJSON-885264932-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:09:16Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='79a02ae084a541b1b7a3fda0190b9ae4',uuid=b2751b9c-c966-416d-aaaa-81756198849c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "address": "fa:16:3e:96:4d:19", "network": {"id": "d5f5ecb0-5758-4318-bb86-c30fc214049a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-43150130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fd70480cd5364a3185fe097f88c290ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapece1440f-e4", "ovs_interfaceid": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Converting VIF {"id": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "address": "fa:16:3e:96:4d:19", "network": {"id": "d5f5ecb0-5758-4318-bb86-c30fc214049a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-43150130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fd70480cd5364a3185fe097f88c290ae", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapece1440f-e4", "ovs_interfaceid": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:96:4d:19,bridge_name='br-int',has_traffic_filtering=True,id=ece1440f-e4b0-4eef-9ab4-6a741473bf2f,network=Network(d5f5ecb0-5758-4318-bb86-c30fc214049a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapece1440f-e4') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG os_vif [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:4d:19,bridge_name='br-int',has_traffic_filtering=True,id=ece1440f-e4b0-4eef-9ab4-6a741473bf2f,network=Network(d5f5ecb0-5758-4318-bb86-c30fc214049a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapece1440f-e4') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapece1440f-e4, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapece1440f-e4, col_values=(('external_ids', {'iface-id': 'ece1440f-e4b0-4eef-9ab4-6a741473bf2f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:96:4d:19', 'vm-uuid': 'b2751b9c-c966-416d-aaaa-81756198849c'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:23 user nova-compute[71972]: INFO os_vif [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:96:4d:19,bridge_name='br-int',has_traffic_filtering=True,id=ece1440f-e4b0-4eef-9ab4-6a741473bf2f,network=Network(d5f5ecb0-5758-4318-bb86-c30fc214049a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapece1440f-e4') Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] No BDM found with device name sda, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] No BDM found with device name sdb, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] No VIF found with MAC fa:16:3e:96:4d:19, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:09:23 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Using config drive Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.compute.manager [req-6f0fe6b5-75cf-48b2-8532-340423a6f91e req-30e7708d-10c8-4b2a-8c27-fd6b6af39c41 service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Received event network-changed-ece1440f-e4b0-4eef-9ab4-6a741473bf2f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.compute.manager [req-6f0fe6b5-75cf-48b2-8532-340423a6f91e req-30e7708d-10c8-4b2a-8c27-fd6b6af39c41 service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Refreshing instance network info cache due to event network-changed-ece1440f-e4b0-4eef-9ab4-6a741473bf2f. 
{{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-6f0fe6b5-75cf-48b2-8532-340423a6f91e req-30e7708d-10c8-4b2a-8c27-fd6b6af39c41 service nova] Acquiring lock "refresh_cache-b2751b9c-c966-416d-aaaa-81756198849c" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-6f0fe6b5-75cf-48b2-8532-340423a6f91e req-30e7708d-10c8-4b2a-8c27-fd6b6af39c41 service nova] Acquired lock "refresh_cache-b2751b9c-c966-416d-aaaa-81756198849c" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG nova.network.neutron [req-6f0fe6b5-75cf-48b2-8532-340423a6f91e req-30e7708d-10c8-4b2a-8c27-fd6b6af39c41 service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Refreshing network info cache for port ece1440f-e4b0-4eef-9ab4-6a741473bf2f {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:24 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Creating config drive at /opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c/disk.config Apr 17 22:09:24 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Running cmd (subprocess): genisoimage -o /opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V config-2 /tmp/tmpcuyyu3x6 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:24 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] CMD "genisoimage -o /opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 0.0.0 -quiet -J -r -V config-2 /tmp/tmpcuyyu3x6" returned: 0 in 0.092s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:24 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:24 user nova-compute[71972]: DEBUG nova.network.neutron [req-6f0fe6b5-75cf-48b2-8532-340423a6f91e req-30e7708d-10c8-4b2a-8c27-fd6b6af39c41 service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Updated VIF entry in instance network info cache for port ece1440f-e4b0-4eef-9ab4-6a741473bf2f. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:09:24 user nova-compute[71972]: DEBUG nova.network.neutron [req-6f0fe6b5-75cf-48b2-8532-340423a6f91e req-30e7708d-10c8-4b2a-8c27-fd6b6af39c41 service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Updating instance_info_cache with network_info: [{"id": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "address": "fa:16:3e:96:4d:19", "network": {"id": "d5f5ecb0-5758-4318-bb86-c30fc214049a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-43150130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fd70480cd5364a3185fe097f88c290ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapece1440f-e4", "ovs_interfaceid": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:24 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-6f0fe6b5-75cf-48b2-8532-340423a6f91e req-30e7708d-10c8-4b2a-8c27-fd6b6af39c41 service nova] Releasing lock "refresh_cache-b2751b9c-c966-416d-aaaa-81756198849c" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:25 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:26 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:26 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:26 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:26 user nova-compute[71972]: DEBUG nova.compute.manager [req-1afec163-5771-4a76-ae77-66847137ac96 req-0fc9a5b0-7879-4156-9e52-0ea581c6358d service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Received event network-vif-plugged-ece1440f-e4b0-4eef-9ab4-6a741473bf2f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:26 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-1afec163-5771-4a76-ae77-66847137ac96 req-0fc9a5b0-7879-4156-9e52-0ea581c6358d service nova] Acquiring lock "b2751b9c-c966-416d-aaaa-81756198849c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:26 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-1afec163-5771-4a76-ae77-66847137ac96 req-0fc9a5b0-7879-4156-9e52-0ea581c6358d service nova] Lock "b2751b9c-c966-416d-aaaa-81756198849c-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:26 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-1afec163-5771-4a76-ae77-66847137ac96 req-0fc9a5b0-7879-4156-9e52-0ea581c6358d service nova] Lock "b2751b9c-c966-416d-aaaa-81756198849c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:26 user nova-compute[71972]: DEBUG nova.compute.manager [req-1afec163-5771-4a76-ae77-66847137ac96 req-0fc9a5b0-7879-4156-9e52-0ea581c6358d service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] No waiting events found dispatching network-vif-plugged-ece1440f-e4b0-4eef-9ab4-6a741473bf2f {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:09:26 user nova-compute[71972]: WARNING nova.compute.manager [req-1afec163-5771-4a76-ae77-66847137ac96 req-0fc9a5b0-7879-4156-9e52-0ea581c6358d service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Received unexpected event network-vif-plugged-ece1440f-e4b0-4eef-9ab4-6a741473bf2f for instance with vm_state building and task_state spawning. Apr 17 22:09:26 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:27 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:27 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:27 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:27 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:27 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:27 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:28 user nova-compute[71972]: DEBUG nova.compute.manager [req-6a371696-e0c6-450f-8cc5-df56645deed6 req-4d218134-d294-41ca-b38f-44da93a5775a service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Received event network-vif-plugged-ece1440f-e4b0-4eef-9ab4-6a741473bf2f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-6a371696-e0c6-450f-8cc5-df56645deed6 
req-4d218134-d294-41ca-b38f-44da93a5775a service nova] Acquiring lock "b2751b9c-c966-416d-aaaa-81756198849c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-6a371696-e0c6-450f-8cc5-df56645deed6 req-4d218134-d294-41ca-b38f-44da93a5775a service nova] Lock "b2751b9c-c966-416d-aaaa-81756198849c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-6a371696-e0c6-450f-8cc5-df56645deed6 req-4d218134-d294-41ca-b38f-44da93a5775a service nova] Lock "b2751b9c-c966-416d-aaaa-81756198849c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:28 user nova-compute[71972]: DEBUG nova.compute.manager [req-6a371696-e0c6-450f-8cc5-df56645deed6 req-4d218134-d294-41ca-b38f-44da93a5775a service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] No waiting events found dispatching network-vif-plugged-ece1440f-e4b0-4eef-9ab4-6a741473bf2f {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:09:28 user nova-compute[71972]: WARNING nova.compute.manager [req-6a371696-e0c6-450f-8cc5-df56645deed6 req-4d218134-d294-41ca-b38f-44da93a5775a service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Received unexpected event network-vif-plugged-ece1440f-e4b0-4eef-9ab4-6a741473bf2f for instance with vm_state building and task_state spawning. Apr 17 22:09:28 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:09:28 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: b2751b9c-c966-416d-aaaa-81756198849c] VM Resumed (Lifecycle Event) Apr 17 22:09:28 user nova-compute[71972]: DEBUG nova.compute.manager [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Instance event wait completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:09:28 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:09:28 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Instance spawned successfully. 
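The entries above trace a single boot request (req-ee0076cd-85ca-4633-884d-b195b91c8682) through VIF plugging, config-drive creation, the network-vif-plugged events and the final spawn, interleaved with unrelated ovsdb wakeups and periodic-task noise. A minimal Python sketch for pulling one request's entries out of a saved journal dump follows; the file name and the matching approach are assumptions for illustration, not part of the log.

    # Sketch: isolate one request's entries from a saved nova-compute journal dump.
    # Assumptions (not taken from the log): the dump lives in "nova-compute.log" and
    # each entry mentions its request id, e.g. "req-ee0076cd-85ca-4633-884d-b195b91c8682".
    import sys

    def entries_for_request(path, req_id):
        """Yield journal lines that mention the given request id."""
        with open(path, encoding="utf-8", errors="replace") as fh:
            for line in fh:
                if req_id in line:
                    yield line.rstrip("\n")

    if __name__ == "__main__":
        # Usage: python trace_request.py nova-compute.log req-ee0076cd-85ca-4633-884d-b195b91c8682
        for entry in entries_for_request(sys.argv[1], sys.argv[2]):
            print(entry)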
Apr 17 22:09:28 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Attempting to register defaults for the following image properties: ['hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:09:28 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:28 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:28 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:28 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Found default for hw_vif_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:28 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:28 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:09:28 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: b2751b9c-c966-416d-aaaa-81756198849c] During sync_power_state the instance has a pending task (spawning). Skip. 
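The "Synchronizing instance power state" entry above compares the database power_state (0) with the state the hypervisor reports (1) while task_state is still spawning, and the following INFO line shows the sync being skipped because a pending task owns the instance. The sketch below is a simplified illustration of that decision as the log describes it, not Nova's implementation; the numeric codes follow Nova's power_state convention (0 = NOSTATE, 1 = RUNNING), and the function name is made up for the example.

    # Simplified illustration of the power-state sync decision logged above.
    # Not Nova's code; names and structure are assumptions for illustration.
    # Numeric codes as in nova.compute.power_state: 0 = NOSTATE, 1 = RUNNING.
    NOSTATE, RUNNING = 0, 1

    def sync_power_state(db_power_state, vm_power_state, task_state):
        """Decide what to do when the DB and the hypervisor disagree."""
        if task_state is not None:
            # A pending task (here 'spawning') is still driving the instance,
            # so the periodic sync defers instead of overwriting the DB state.
            return "skip"
        if db_power_state != vm_power_state:
            return "update-db"
        return "in-sync"

    assert sync_power_state(NOSTATE, RUNNING, "spawning") == "skip"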
Apr 17 22:09:28 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:09:28 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: b2751b9c-c966-416d-aaaa-81756198849c] VM Started (Lifecycle Event) Apr 17 22:09:29 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:29 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:09:29 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: b2751b9c-c966-416d-aaaa-81756198849c] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:09:29 user nova-compute[71972]: INFO nova.compute.manager [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Took 13.30 seconds to spawn the instance on the hypervisor. Apr 17 22:09:29 user nova-compute[71972]: DEBUG nova.compute.manager [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:29 user nova-compute[71972]: INFO nova.compute.manager [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Took 16.79 seconds to build instance. 
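The INFO entries above record per-step timings for instance b2751b9c-c966-416d-aaaa-81756198849c: 13.30 seconds to spawn on the hypervisor and 16.79 seconds for the whole build. A small sketch for collecting those "Took N seconds to ..." figures from a saved journal follows; the file name and the regular expression are assumptions that mirror the wording visible here, not a guaranteed log format.

    # Sketch: gather per-instance "Took N seconds to ..." timings from a saved log.
    # Assumption: the journal was dumped to "nova-compute.log".
    import re
    from collections import defaultdict

    TIMING = re.compile(
        r"\[instance: (?P<uuid>[0-9a-f-]{36})\] "
        r"Took (?P<secs>[\d.]+) seconds to (?P<what>[^.]+)\."
    )

    def collect_timings(path):
        """Return {instance uuid: {step description: seconds}}."""
        timings = defaultdict(dict)
        with open(path, encoding="utf-8", errors="replace") as fh:
            for line in fh:
                m = TIMING.search(line)
                if m:
                    timings[m.group("uuid")][m.group("what")] = float(m.group("secs"))
        return timings

    if __name__ == "__main__":
        for uuid, steps in collect_timings("nova-compute.log").items():
            print(uuid, steps)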
Apr 17 22:09:29 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-ee0076cd-85ca-4633-884d-b195b91c8682 tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Lock "b2751b9c-c966-416d-aaaa-81756198849c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 16.901s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:32 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:09:32 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:09:32 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Cleaning up deleted instances {{(pid=71972) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 17 22:09:32 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] There are 0 instances to clean {{(pid=71972) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 17 22:09:32 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:09:32 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Cleaning up deleted instances with incomplete migration {{(pid=71972) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 17 22:09:32 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:09:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:34 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:09:35 user 
nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:09:35 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:09:35 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:09:36 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:09:36 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:09:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:36 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:09:36 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None 
req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json" returned: 0 in 0.195s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json" returned: 0 in 0.166s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3681b009-1a99-4eb8-b189-3fe0647f5d1d/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3681b009-1a99-4eb8-b189-3fe0647f5d1d/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3681b009-1a99-4eb8-b189-3fe0647f5d1d/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3681b009-1a99-4eb8-b189-3fe0647f5d1d/disk --force-share --output=json" returned: 0 in 0.161s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json" returned: 0 in 0.165s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c/disk --force-share --output=json" returned: 0 in 0.151s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:38 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 
None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c/disk --force-share --output=json" returned: 0 in 0.165s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cbaa3995-f00d-4194-b7e2-29bfc6e27614/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cbaa3995-f00d-4194-b7e2-29bfc6e27614/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cbaa3995-f00d-4194-b7e2-29bfc6e27614/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:38 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cbaa3995-f00d-4194-b7e2-29bfc6e27614/disk --force-share --output=json" returned: 0 in 0.188s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): 
/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json" returned: 0 in 0.207s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4fc7798-567a-4002-b056-6c4f02d0e955/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4fc7798-567a-4002-b056-6c4f02d0e955/disk --force-share --output=json" returned: 0 in 0.164s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4fc7798-567a-4002-b056-6c4f02d0e955/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4fc7798-567a-4002-b056-6c4f02d0e955/disk --force-share --output=json" returned: 0 in 0.165s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71972) 
execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:40 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:09:40 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:09:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=7787MB free_disk=26.62417221069336GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": 
"0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:09:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance c4fc7798-567a-4002-b056-6c4f02d0e955 actively managed on this compute host and has allocations in placement: {'resources': 
{'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:09:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance cbaa3995-f00d-4194-b7e2-29bfc6e27614 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:09:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 9634492c-168d-4b49-941a-b89703571b73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:09:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 97067629-e099-49fd-bb79-223dd4401405 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:09:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 99cb7131-abb8-41d6-bddd-c3bc943b7678 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:09:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 3681b009-1a99-4eb8-b189-3fe0647f5d1d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:09:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 7bb0580b-727f-4168-9d56-56dcb4fa404e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:09:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance b2751b9c-c966-416d-aaaa-81756198849c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:09:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 8 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:09:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=1536MB phys_disk=40GB used_disk=8GB total_vcpus=12 used_vcpus=8 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:09:40 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:09:40 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:09:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:09:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.514s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:41 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:41 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:09:41 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:09:41 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:09:41 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] 
Rebuilding the list of instances to heal {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 22:09:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "refresh_cache-7bb0580b-727f-4168-9d56-56dcb4fa404e" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquired lock "refresh_cache-7bb0580b-727f-4168-9d56-56dcb4fa404e" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:41 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Forcefully refreshing network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 22:09:41 user nova-compute[71972]: DEBUG nova.objects.instance [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lazy-loading 'info_cache' on Instance uuid 7bb0580b-727f-4168-9d56-56dcb4fa404e {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:09:42 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:42 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:43 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:43 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Updating instance_info_cache with network_info: [{"id": "06548a29-a501-4b57-97f1-8afe930c8463", "address": "fa:16:3e:d4:82:7c", "network": {"id": "3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap06548a29-a5", "ovs_interfaceid": "06548a29-a501-4b57-97f1-8afe930c8463", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Releasing lock "refresh_cache-7bb0580b-727f-4168-9d56-56dcb4fa404e" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:43 user nova-compute[71972]: DEBUG 
nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Updated the network info_cache for instance {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 22:09:43 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:09:45 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Acquiring lock "04e443ff-f9f1-4424-847d-ea7557307ec6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "04e443ff-f9f1-4424-847d-ea7557307ec6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:46 user nova-compute[71972]: DEBUG nova.compute.manager [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Starting instance... {{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:09:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:46 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:09:46 user nova-compute[71972]: INFO nova.compute.claims [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Claim successful on node user Apr 17 22:09:46 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Refreshing inventories for resource provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 17 22:09:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Acquiring lock "0c76babb-25fa-4e8b-9e09-c705153a95e5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Lock "0c76babb-25fa-4e8b-9e09-c705153a95e5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:46 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Updating ProviderTree inventory for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 17 22:09:46 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Updating inventory in ProviderTree for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 17 22:09:46 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Starting instance... 
{{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:09:46 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Refreshing aggregate associations for resource provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e, aggregates: None {{(pid=71972) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 17 22:09:46 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Refreshing trait associations for resource provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e, traits: HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_RESCUE_BFV,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NODE,HW_CPU_X86_SSE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_VOLUME_MULTI_ATTACH {{(pid=71972) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 17 22:09:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 
'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.533s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG nova.compute.manager [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.310s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:09:47 user nova-compute[71972]: INFO nova.compute.claims [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Claim successful on node user Apr 17 22:09:47 user nova-compute[71972]: DEBUG nova.compute.manager [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Allocating IP information in the background. {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG nova.network.neutron [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:09:47 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 22:09:47 user nova-compute[71972]: DEBUG nova.compute.manager [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Start building block device mappings for instance. 
{{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG nova.policy [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e11872391f1a487a8a8ba5a6d13589f1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a82db257b3494faca3f3759644a51b30', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG nova.compute.manager [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Start spawning the instance on the hypervisor. {{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:09:47 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Creating image(s) Apr 17 22:09:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Acquiring lock "/opt/stack/data/nova/instances/04e443ff-f9f1-4424-847d-ea7557307ec6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "/opt/stack/data/nova/instances/04e443ff-f9f1-4424-847d-ea7557307ec6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "/opt/stack/data/nova/instances/04e443ff-f9f1-4424-847d-ea7557307ec6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None 
req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Acquiring lock "a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092.part --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092.part --force-share --output=json" returned: 0 in 0.159s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG nova.virt.images [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] 
56106bc9-e4d5-47fb-b7a0-f8aadd59975a was qcow2, converting to raw {{(pid=71972) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG nova.privsep.utils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71972) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092.part /opt/stack/data/nova/instances/_base/a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092.converted {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.756s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Allocating IP information in the background. 
{{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:09:47 user nova-compute[71972]: DEBUG nova.network.neutron [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092.part /opt/stack/data/nova/instances/_base/a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092.converted" returned: 0 in 0.148s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092.converted --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:48 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 22:09:48 user nova-compute[71972]: DEBUG nova.network.neutron [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Successfully created port: 1f194f69-5d98-4774-958a-1b5b81d978d3 {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Start building block device mappings for instance. 
{{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092.converted --force-share --output=json" returned: 0 in 0.142s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 0.768s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Start spawning the instance on the hypervisor. 
{{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:09:48 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Creating image(s) Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Acquiring lock "/opt/stack/data/nova/instances/0c76babb-25fa-4e8b-9e09-c705153a95e5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Lock "/opt/stack/data/nova/instances/0c76babb-25fa-4e8b-9e09-c705153a95e5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Lock "/opt/stack/data/nova/instances/0c76babb-25fa-4e8b-9e09-c705153a95e5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092 --force-share --output=json" returned: 0 in 0.150s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 
tempest-TestMinimumBasicScenario-475067891-project-member] Acquiring lock "a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.003s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.134s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 
tempest-TestMinimumBasicScenario-475067891-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092 --force-share --output=json" returned: 0 in 0.148s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092,backing_fmt=raw /opt/stack/data/nova/instances/04e443ff-f9f1-4424-847d-ea7557307ec6/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.138s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/0c76babb-25fa-4e8b-9e09-c705153a95e5/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG nova.policy [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '74aabdff0142403bbb29d96ad103d2f8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '54885ecab9394b39a2a0d287761eda71', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092,backing_fmt=raw /opt/stack/data/nova/instances/04e443ff-f9f1-4424-847d-ea7557307ec6/disk 1073741824" returned: 0 in 0.100s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG 
oslo_concurrency.lockutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.255s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/0c76babb-25fa-4e8b-9e09-c705153a95e5/disk 1073741824" returned: 0 in 0.097s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.245s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/a3f5a6ecbd0b5411fd00bb376a9e9b59fd1f6092 --force-share --output=json" returned: 0 in 0.164s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Checking if we can resize image /opt/stack/data/nova/instances/04e443ff-f9f1-4424-847d-ea7557307ec6/disk. 
size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/04e443ff-f9f1-4424-847d-ea7557307ec6/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.160s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Checking if we can resize image /opt/stack/data/nova/instances/0c76babb-25fa-4e8b-9e09-c705153a95e5/disk. size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0c76babb-25fa-4e8b-9e09-c705153a95e5/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG nova.network.neutron [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Successfully updated port: 1f194f69-5d98-4774-958a-1b5b81d978d3 {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Acquiring lock "refresh_cache-04e443ff-f9f1-4424-847d-ea7557307ec6" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Acquired lock "refresh_cache-04e443ff-f9f1-4424-847d-ea7557307ec6" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG nova.network.neutron [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 
tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG nova.compute.manager [req-22b6dd77-0c3d-4efa-bde7-79e5e1c108ec req-f49eaa16-22a8-4ffb-b6e6-b0a171440723 service nova] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Received event network-changed-1f194f69-5d98-4774-958a-1b5b81d978d3 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG nova.compute.manager [req-22b6dd77-0c3d-4efa-bde7-79e5e1c108ec req-f49eaa16-22a8-4ffb-b6e6-b0a171440723 service nova] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Refreshing instance network info cache due to event network-changed-1f194f69-5d98-4774-958a-1b5b81d978d3. {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-22b6dd77-0c3d-4efa-bde7-79e5e1c108ec req-f49eaa16-22a8-4ffb-b6e6-b0a171440723 service nova] Acquiring lock "refresh_cache-04e443ff-f9f1-4424-847d-ea7557307ec6" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/04e443ff-f9f1-4424-847d-ea7557307ec6/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Cannot resize image /opt/stack/data/nova/instances/04e443ff-f9f1-4424-847d-ea7557307ec6/disk to a smaller size. 
{{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG nova.objects.instance [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lazy-loading 'migration_context' on Instance uuid 04e443ff-f9f1-4424-847d-ea7557307ec6 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Ensure instance console log exists: /opt/stack/data/nova/instances/04e443ff-f9f1-4424-847d-ea7557307ec6/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0c76babb-25fa-4e8b-9e09-c705153a95e5/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Cannot resize image /opt/stack/data/nova/instances/0c76babb-25fa-4e8b-9e09-c705153a95e5/disk to a smaller size. 
{{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG nova.objects.instance [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Lazy-loading 'migration_context' on Instance uuid 0c76babb-25fa-4e8b-9e09-c705153a95e5 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG nova.network.neutron [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Instance cache missing network info. {{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Ensure instance console log exists: /opt/stack/data/nova/instances/0c76babb-25fa-4e8b-9e09-c705153a95e5/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.network.neutron [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Updating instance_info_cache with network_info: [{"id": "1f194f69-5d98-4774-958a-1b5b81d978d3", "address": "fa:16:3e:21:0f:6f", "network": {"id": "a91184a7-8ecf-419a-a191-2853dd054c4b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1266987381-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a82db257b3494faca3f3759644a51b30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f194f69-5d", "ovs_interfaceid": "1f194f69-5d98-4774-958a-1b5b81d978d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Releasing lock "refresh_cache-04e443ff-f9f1-4424-847d-ea7557307ec6" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.compute.manager [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Instance network_info: |[{"id": "1f194f69-5d98-4774-958a-1b5b81d978d3", "address": "fa:16:3e:21:0f:6f", "network": {"id": "a91184a7-8ecf-419a-a191-2853dd054c4b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1266987381-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a82db257b3494faca3f3759644a51b30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f194f69-5d", "ovs_interfaceid": "1f194f69-5d98-4774-958a-1b5b81d978d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-22b6dd77-0c3d-4efa-bde7-79e5e1c108ec req-f49eaa16-22a8-4ffb-b6e6-b0a171440723 service nova] Acquired lock "refresh_cache-04e443ff-f9f1-4424-847d-ea7557307ec6" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.network.neutron [req-22b6dd77-0c3d-4efa-bde7-79e5e1c108ec req-f49eaa16-22a8-4ffb-b6e6-b0a171440723 service nova] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Refreshing network info cache for port 1f194f69-5d98-4774-958a-1b5b81d978d3 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 
tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Start _get_guest_xml network_info=[{"id": "1f194f69-5d98-4774-958a-1b5b81d978d3", "address": "fa:16:3e:21:0f:6f", "network": {"id": "a91184a7-8ecf-419a-a191-2853dd054c4b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1266987381-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a82db257b3494faca3f3759644a51b30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f194f69-5d", "ovs_interfaceid": "1f194f69-5d98-4774-958a-1b5b81d978d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:09:44Z,direct_url=,disk_format='qcow2',id=56106bc9-e4d5-47fb-b7a0-f8aadd59975a,min_disk=0,min_ram=0,name='tempest-scenario-img--1884796391',owner='a82db257b3494faca3f3759644a51b30',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:09:45Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '56106bc9-e4d5-47fb-b7a0-f8aadd59975a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:09:49 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:09:49 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
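The qemu-img probes recorded in the entries above run under explicit resource caps: each `qemu-img info ... --force-share --output=json` call is wrapped by `python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30`, limiting the child process to 1 GiB of address space and 30 s of CPU time before can_resize_image compares the reported virtual size against the requested size (hence the "Cannot resize image ... to a smaller size" entries). Below is a minimal, illustrative sketch of issuing the same capped probe, assuming oslo.concurrency's processutils/ProcessLimits interface; the probe_image helper is hypothetical and not Nova's own code.

    import json

    from oslo_concurrency import processutils

    def probe_image(path):
        # Illustrative sketch (assumption: ProcessLimits/execute as exposed by
        # oslo.concurrency). execute() re-spawns the command as:
        #   python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 --
        #   env LC_ALL=C LANG=C qemu-img info <path> --force-share --output=json
        out, _err = processutils.execute(
            'env', 'LC_ALL=C', 'LANG=C',
            'qemu-img', 'info', path, '--force-share', '--output=json',
            prlimit=processutils.ProcessLimits(address_space=1073741824,
                                               cpu_time=30),
        )
        return json.loads(out)

The caps keep a malformed or hostile image from making the qemu-img child hang or exhaust memory while the compute service is spawning a guest.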
Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:09:44Z,direct_url=,disk_format='qcow2',id=56106bc9-e4d5-47fb-b7a0-f8aadd59975a,min_disk=0,min_ram=0,name='tempest-scenario-img--1884796391',owner='a82db257b3494faca3f3759644a51b30',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:09:45Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71972) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:09:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1918210233',display_name='tempest-TestMinimumBasicScenario-server-1918210233',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1918210233',id=9,image_ref='56106bc9-e4d5-47fb-b7a0-f8aadd59975a',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBByiPN5htPjbADhNBJJivdJ60suVrHNjFB0wkgRGT7xEfBp6fHU53P+o170ZE919d8wE4Y4BNeLjkXtGWbkhkDeRYSJIkjhlyNjlASEDFcsNq/wi6hmW0/zfJjZX/wd3og==',key_name='tempest-TestMinimumBasicScenario-706092591',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a82db257b3494faca3f3759644a51b30',ramdisk_id='',reservation_id='r-nri78qe3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='56106bc9-e4d5-47fb-b7a0-f8aadd59975a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-475067891',owner_user_name='tempest-TestMinimumBasicScenario-475067891-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:09:47Z,user_data=None,user_id='e11872391f1a487a8a8ba5a6d13589f1',uuid=04e443ff-f9f1-4424-847d-ea7557307ec6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1f194f69-5d98-4774-958a-1b5b81d978d3", "address": "fa:16:3e:21:0f:6f", "network": {"id": "a91184a7-8ecf-419a-a191-2853dd054c4b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1266987381-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a82db257b3494faca3f3759644a51b30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f194f69-5d", "ovs_interfaceid": "1f194f69-5d98-4774-958a-1b5b81d978d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Converting VIF {"id": "1f194f69-5d98-4774-958a-1b5b81d978d3", "address": "fa:16:3e:21:0f:6f", "network": {"id": "a91184a7-8ecf-419a-a191-2853dd054c4b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1266987381-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a82db257b3494faca3f3759644a51b30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f194f69-5d", "ovs_interfaceid": 
"1f194f69-5d98-4774-958a-1b5b81d978d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:0f:6f,bridge_name='br-int',has_traffic_filtering=True,id=1f194f69-5d98-4774-958a-1b5b81d978d3,network=Network(a91184a7-8ecf-419a-a191-2853dd054c4b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f194f69-5d') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.objects.instance [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lazy-loading 'pci_devices' on Instance uuid 04e443ff-f9f1-4424-847d-ea7557307ec6 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] End _get_guest_xml xml= Apr 17 22:09:49 user nova-compute[71972]: 04e443ff-f9f1-4424-847d-ea7557307ec6 Apr 17 22:09:49 user nova-compute[71972]: instance-00000009 Apr 17 22:09:49 user nova-compute[71972]: 131072 Apr 17 22:09:49 user nova-compute[71972]: 1 Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: tempest-TestMinimumBasicScenario-server-1918210233 Apr 17 22:09:49 user nova-compute[71972]: 2023-04-17 22:09:49 Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: 128 Apr 17 22:09:49 user nova-compute[71972]: 1 Apr 17 22:09:49 user nova-compute[71972]: 0 Apr 17 22:09:49 user nova-compute[71972]: 0 Apr 17 22:09:49 user nova-compute[71972]: 1 Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: tempest-TestMinimumBasicScenario-475067891-project-member Apr 17 22:09:49 user nova-compute[71972]: tempest-TestMinimumBasicScenario-475067891 Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: OpenStack Foundation Apr 17 22:09:49 user nova-compute[71972]: OpenStack Nova Apr 17 22:09:49 user nova-compute[71972]: 0.0.0 Apr 17 22:09:49 user nova-compute[71972]: 04e443ff-f9f1-4424-847d-ea7557307ec6 Apr 17 22:09:49 user nova-compute[71972]: 04e443ff-f9f1-4424-847d-ea7557307ec6 Apr 17 22:09:49 user nova-compute[71972]: Virtual Machine Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 
17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: hvm Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Nehalem Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: /dev/urandom Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: Apr 17 22:09:49 user nova-compute[71972]: {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:09:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1918210233',display_name='tempest-TestMinimumBasicScenario-server-1918210233',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1918210233',id=9,image_ref='56106bc9-e4d5-47fb-b7a0-f8aadd59975a',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBByiPN5htPjbADhNBJJivdJ60suVrHNjFB0wkgRGT7xEfBp6fHU53P+o170ZE919d8wE4Y4BNeLjkXtGWbkhkDeRYSJIkjhlyNjlASEDFcsNq/wi6hmW0/zfJjZX/wd3og==',key_name='tempest-TestMinimumBasicScenario-706092591',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a82db257b3494faca3f3759644a51b30',ramdisk_id='',reservation_id='r-nri78qe3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='56106bc9-e4d5-47fb-b7a0-f8aadd59975a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-475067891',owner_user_name='tempest-TestMinimumBasicScenario-475067891-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:09:47Z,user_data=None,user_id='e11872391f1a487a8a8ba5a6d13589f1',uuid=04e443ff-f9f1-4424-847d-ea7557307ec6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1f194f69-5d98-4774-958a-1b5b81d978d3", "address": "fa:16:3e:21:0f:6f", "network": {"id": "a91184a7-8ecf-419a-a191-2853dd054c4b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1266987381-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a82db257b3494faca3f3759644a51b30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f194f69-5d", "ovs_interfaceid": "1f194f69-5d98-4774-958a-1b5b81d978d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Converting VIF {"id": "1f194f69-5d98-4774-958a-1b5b81d978d3", "address": "fa:16:3e:21:0f:6f", "network": {"id": "a91184a7-8ecf-419a-a191-2853dd054c4b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1266987381-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a82db257b3494faca3f3759644a51b30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f194f69-5d", "ovs_interfaceid": 
"1f194f69-5d98-4774-958a-1b5b81d978d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:0f:6f,bridge_name='br-int',has_traffic_filtering=True,id=1f194f69-5d98-4774-958a-1b5b81d978d3,network=Network(a91184a7-8ecf-419a-a191-2853dd054c4b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f194f69-5d') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG os_vif [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:0f:6f,bridge_name='br-int',has_traffic_filtering=True,id=1f194f69-5d98-4774-958a-1b5b81d978d3,network=Network(a91184a7-8ecf-419a-a191-2853dd054c4b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f194f69-5d') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f194f69-5d, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1f194f69-5d, col_values=(('external_ids', {'iface-id': '1f194f69-5d98-4774-958a-1b5b81d978d3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:21:0f:6f', 'vm-uuid': '04e443ff-f9f1-4424-847d-ea7557307ec6'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:49 user nova-compute[71972]: INFO os_vif [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:0f:6f,bridge_name='br-int',has_traffic_filtering=True,id=1f194f69-5d98-4774-958a-1b5b81d978d3,network=Network(a91184a7-8ecf-419a-a191-2853dd054c4b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f194f69-5d') Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] No BDM found with device name vda, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] No VIF found with MAC fa:16:3e:21:0f:6f, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.network.neutron [req-22b6dd77-0c3d-4efa-bde7-79e5e1c108ec req-f49eaa16-22a8-4ffb-b6e6-b0a171440723 service nova] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Updated VIF entry in instance network info cache for port 1f194f69-5d98-4774-958a-1b5b81d978d3. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG nova.network.neutron [req-22b6dd77-0c3d-4efa-bde7-79e5e1c108ec req-f49eaa16-22a8-4ffb-b6e6-b0a171440723 service nova] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Updating instance_info_cache with network_info: [{"id": "1f194f69-5d98-4774-958a-1b5b81d978d3", "address": "fa:16:3e:21:0f:6f", "network": {"id": "a91184a7-8ecf-419a-a191-2853dd054c4b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1266987381-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a82db257b3494faca3f3759644a51b30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f194f69-5d", "ovs_interfaceid": "1f194f69-5d98-4774-958a-1b5b81d978d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-22b6dd77-0c3d-4efa-bde7-79e5e1c108ec req-f49eaa16-22a8-4ffb-b6e6-b0a171440723 service nova] Releasing lock "refresh_cache-04e443ff-f9f1-4424-847d-ea7557307ec6" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:50 user nova-compute[71972]: DEBUG nova.network.neutron [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Successfully created port: 4629d4b1-f472-4302-9bf6-94f62369c1c1 {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:09:50 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:50 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:50 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.network.neutron [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Successfully updated port: 4629d4b1-f472-4302-9bf6-94f62369c1c1 {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Acquiring lock "refresh_cache-0c76babb-25fa-4e8b-9e09-c705153a95e5" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:51 user nova-compute[71972]: 
DEBUG oslo_concurrency.lockutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Acquired lock "refresh_cache-0c76babb-25fa-4e8b-9e09-c705153a95e5" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.network.neutron [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.compute.manager [req-1cfb1270-d0ec-47c2-b617-986e7bd262c4 req-6f796a4f-2967-4640-ac85-c8a44cfca34d service nova] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Received event network-changed-4629d4b1-f472-4302-9bf6-94f62369c1c1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.compute.manager [req-1cfb1270-d0ec-47c2-b617-986e7bd262c4 req-6f796a4f-2967-4640-ac85-c8a44cfca34d service nova] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Refreshing instance network info cache due to event network-changed-4629d4b1-f472-4302-9bf6-94f62369c1c1. {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-1cfb1270-d0ec-47c2-b617-986e7bd262c4 req-6f796a4f-2967-4640-ac85-c8a44cfca34d service nova] Acquiring lock "refresh_cache-0c76babb-25fa-4e8b-9e09-c705153a95e5" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.compute.manager [req-797ce8e5-2fb5-48a6-90dd-160ccdee1fc8 req-c93e8801-2365-4702-b7d9-20d4e7936d1c service nova] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Received event network-vif-plugged-1f194f69-5d98-4774-958a-1b5b81d978d3 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-797ce8e5-2fb5-48a6-90dd-160ccdee1fc8 req-c93e8801-2365-4702-b7d9-20d4e7936d1c service nova] Acquiring lock "04e443ff-f9f1-4424-847d-ea7557307ec6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-797ce8e5-2fb5-48a6-90dd-160ccdee1fc8 req-c93e8801-2365-4702-b7d9-20d4e7936d1c service nova] Lock "04e443ff-f9f1-4424-847d-ea7557307ec6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-797ce8e5-2fb5-48a6-90dd-160ccdee1fc8 req-c93e8801-2365-4702-b7d9-20d4e7936d1c service nova] Lock "04e443ff-f9f1-4424-847d-ea7557307ec6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.compute.manager [req-797ce8e5-2fb5-48a6-90dd-160ccdee1fc8 
req-c93e8801-2365-4702-b7d9-20d4e7936d1c service nova] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] No waiting events found dispatching network-vif-plugged-1f194f69-5d98-4774-958a-1b5b81d978d3 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:09:51 user nova-compute[71972]: WARNING nova.compute.manager [req-797ce8e5-2fb5-48a6-90dd-160ccdee1fc8 req-c93e8801-2365-4702-b7d9-20d4e7936d1c service nova] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Received unexpected event network-vif-plugged-1f194f69-5d98-4774-958a-1b5b81d978d3 for instance with vm_state building and task_state spawning. Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.network.neutron [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Instance cache missing network info. {{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.network.neutron [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Updating instance_info_cache with network_info: [{"id": "4629d4b1-f472-4302-9bf6-94f62369c1c1", "address": "fa:16:3e:a8:48:3e", "network": {"id": "d8b9fec1-bd04-4ad0-a353-9e6f92eee9dd", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1408052112-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "54885ecab9394b39a2a0d287761eda71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4629d4b1-f4", "ovs_interfaceid": "4629d4b1-f472-4302-9bf6-94f62369c1c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Releasing lock "refresh_cache-0c76babb-25fa-4e8b-9e09-c705153a95e5" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 
0c76babb-25fa-4e8b-9e09-c705153a95e5] Instance network_info: |[{"id": "4629d4b1-f472-4302-9bf6-94f62369c1c1", "address": "fa:16:3e:a8:48:3e", "network": {"id": "d8b9fec1-bd04-4ad0-a353-9e6f92eee9dd", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1408052112-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "54885ecab9394b39a2a0d287761eda71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4629d4b1-f4", "ovs_interfaceid": "4629d4b1-f472-4302-9bf6-94f62369c1c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-1cfb1270-d0ec-47c2-b617-986e7bd262c4 req-6f796a4f-2967-4640-ac85-c8a44cfca34d service nova] Acquired lock "refresh_cache-0c76babb-25fa-4e8b-9e09-c705153a95e5" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.network.neutron [req-1cfb1270-d0ec-47c2-b617-986e7bd262c4 req-6f796a4f-2967-4640-ac85-c8a44cfca34d service nova] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Refreshing network info cache for port 4629d4b1-f472-4302-9bf6-94f62369c1c1 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Start _get_guest_xml network_info=[{"id": "4629d4b1-f472-4302-9bf6-94f62369c1c1", "address": "fa:16:3e:a8:48:3e", "network": {"id": "d8b9fec1-bd04-4ad0-a353-9e6f92eee9dd", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1408052112-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "54885ecab9394b39a2a0d287761eda71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4629d4b1-f4", "ovs_interfaceid": "4629d4b1-f472-4302-9bf6-94f62369c1c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:09:51 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:09:51 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Image limits 0:0:0 
{{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:09:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1085383727',display_name='tempest-VolumesActionsTest-instance-1085383727',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-1085383727',id=10,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='54885ecab9394b39a2a0d287761eda71',ramdisk_id='',reservation_id='r-eucmfo5p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-217042055',owner_user_name='tempest-VolumesActionsTest-217042055-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:09:48Z,user_data=None,user_id='74aabdff0142403bbb29d96ad103d2f8',uuid=0c76babb-25fa-4e8b-9e09-c705153a95e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4629d4b1-f472-4302-9bf6-94f62369c1c1", "address": "fa:16:3e:a8:48:3e", "network": {"id": "d8b9fec1-bd04-4ad0-a353-9e6f92eee9dd", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1408052112-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "54885ecab9394b39a2a0d287761eda71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4629d4b1-f4", "ovs_interfaceid": "4629d4b1-f472-4302-9bf6-94f62369c1c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Converting VIF {"id": "4629d4b1-f472-4302-9bf6-94f62369c1c1", "address": "fa:16:3e:a8:48:3e", "network": {"id": "d8b9fec1-bd04-4ad0-a353-9e6f92eee9dd", "bridge": "br-int", "label": 
"tempest-VolumesActionsTest-1408052112-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "54885ecab9394b39a2a0d287761eda71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4629d4b1-f4", "ovs_interfaceid": "4629d4b1-f472-4302-9bf6-94f62369c1c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:48:3e,bridge_name='br-int',has_traffic_filtering=True,id=4629d4b1-f472-4302-9bf6-94f62369c1c1,network=Network(d8b9fec1-bd04-4ad0-a353-9e6f92eee9dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4629d4b1-f4') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.objects.instance [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Lazy-loading 'pci_devices' on Instance uuid 0c76babb-25fa-4e8b-9e09-c705153a95e5 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] End _get_guest_xml xml= Apr 17 22:09:51 user nova-compute[71972]: 0c76babb-25fa-4e8b-9e09-c705153a95e5 Apr 17 22:09:51 user nova-compute[71972]: instance-0000000a Apr 17 22:09:51 user nova-compute[71972]: 131072 Apr 17 22:09:51 user nova-compute[71972]: 1 Apr 17 22:09:51 user nova-compute[71972]: Apr 17 22:09:51 user nova-compute[71972]: Apr 17 22:09:51 user nova-compute[71972]: Apr 17 22:09:51 user nova-compute[71972]: tempest-VolumesActionsTest-instance-1085383727 Apr 17 22:09:51 user nova-compute[71972]: 2023-04-17 22:09:51 Apr 17 22:09:51 user nova-compute[71972]: Apr 17 22:09:51 user nova-compute[71972]: 128 Apr 17 22:09:51 user nova-compute[71972]: 1 Apr 17 22:09:51 user nova-compute[71972]: 0 Apr 17 22:09:51 user nova-compute[71972]: 0 Apr 17 22:09:51 user nova-compute[71972]: 1 Apr 17 22:09:51 user nova-compute[71972]: Apr 17 22:09:51 user nova-compute[71972]: Apr 17 22:09:51 user nova-compute[71972]: tempest-VolumesActionsTest-217042055-project-member Apr 17 22:09:51 user nova-compute[71972]: tempest-VolumesActionsTest-217042055 Apr 17 22:09:51 user nova-compute[71972]: Apr 17 22:09:51 user nova-compute[71972]: Apr 17 22:09:51 user nova-compute[71972]: Apr 17 22:09:51 user nova-compute[71972]: Apr 17 22:09:51 user nova-compute[71972]: Apr 17 22:09:51 user nova-compute[71972]: Apr 17 22:09:51 user nova-compute[71972]: Apr 17 22:09:51 user nova-compute[71972]: Apr 17 22:09:51 user nova-compute[71972]: Apr 
17 22:09:51 user nova-compute[71972]: [remainder of the guest domain XML from the _get_guest_xml entry above; the XML markup was stripped when this log was captured, leaving only interleaved timestamps and bare text nodes. Recoverable values from this span: system UUID/serial 0c76babb-25fa-4e8b-9e09-c705153a95e5, sysinfo manufacturer 'OpenStack Foundation', product 'OpenStack Nova' version 0.0.0, family 'Virtual Machine', OS type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:09:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1085383727',display_name='tempest-VolumesActionsTest-instance-1085383727',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-1085383727',id=10,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='54885ecab9394b39a2a0d287761eda71',ramdisk_id='',reservation_id='r-eucmfo5p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesActionsTest-217042055',owner_user_name='tempest-VolumesActionsTest-217042055-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:09:48Z,user_data=None,user_id='74aabdff0142403bbb29d96ad103d2f8',uuid=0c76babb-25fa-4e8b-9e09-c705153a95e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4629d4b1-f472-4302-9bf6-94f62369c1c1", "address": "fa:16:3e:a8:48:3e", "network": {"id": "d8b9fec1-bd04-4ad0-a353-9e6f92eee9dd", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1408052112-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "54885ecab9394b39a2a0d287761eda71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4629d4b1-f4", "ovs_interfaceid": "4629d4b1-f472-4302-9bf6-94f62369c1c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Converting VIF {"id": "4629d4b1-f472-4302-9bf6-94f62369c1c1", "address": "fa:16:3e:a8:48:3e", "network": {"id": "d8b9fec1-bd04-4ad0-a353-9e6f92eee9dd", "bridge": "br-int", "label": 
"tempest-VolumesActionsTest-1408052112-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "54885ecab9394b39a2a0d287761eda71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4629d4b1-f4", "ovs_interfaceid": "4629d4b1-f472-4302-9bf6-94f62369c1c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:48:3e,bridge_name='br-int',has_traffic_filtering=True,id=4629d4b1-f472-4302-9bf6-94f62369c1c1,network=Network(d8b9fec1-bd04-4ad0-a353-9e6f92eee9dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4629d4b1-f4') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG os_vif [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:48:3e,bridge_name='br-int',has_traffic_filtering=True,id=4629d4b1-f472-4302-9bf6-94f62369c1c1,network=Network(d8b9fec1-bd04-4ad0-a353-9e6f92eee9dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4629d4b1-f4') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4629d4b1-f4, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4629d4b1-f4, col_values=(('external_ids', {'iface-id': 
'4629d4b1-f472-4302-9bf6-94f62369c1c1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a8:48:3e', 'vm-uuid': '0c76babb-25fa-4e8b-9e09-c705153a95e5'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:51 user nova-compute[71972]: INFO os_vif [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:48:3e,bridge_name='br-int',has_traffic_filtering=True,id=4629d4b1-f472-4302-9bf6-94f62369c1c1,network=Network(d8b9fec1-bd04-4ad0-a353-9e6f92eee9dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4629d4b1-f4') Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] No BDM found with device name vda, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:09:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] No VIF found with MAC fa:16:3e:a8:48:3e, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:09:52 user nova-compute[71972]: DEBUG nova.network.neutron [req-1cfb1270-d0ec-47c2-b617-986e7bd262c4 req-6f796a4f-2967-4640-ac85-c8a44cfca34d service nova] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Updated VIF entry in instance network info cache for port 4629d4b1-f472-4302-9bf6-94f62369c1c1. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:09:52 user nova-compute[71972]: DEBUG nova.network.neutron [req-1cfb1270-d0ec-47c2-b617-986e7bd262c4 req-6f796a4f-2967-4640-ac85-c8a44cfca34d service nova] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Updating instance_info_cache with network_info: [{"id": "4629d4b1-f472-4302-9bf6-94f62369c1c1", "address": "fa:16:3e:a8:48:3e", "network": {"id": "d8b9fec1-bd04-4ad0-a353-9e6f92eee9dd", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1408052112-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "54885ecab9394b39a2a0d287761eda71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4629d4b1-f4", "ovs_interfaceid": "4629d4b1-f472-4302-9bf6-94f62369c1c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:52 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-1cfb1270-d0ec-47c2-b617-986e7bd262c4 req-6f796a4f-2967-4640-ac85-c8a44cfca34d service nova] Releasing lock "refresh_cache-0c76babb-25fa-4e8b-9e09-c705153a95e5" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:52 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Acquiring lock "d9386728-7c3f-42ff-8f1c-51748ccefff3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:52 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "d9386728-7c3f-42ff-8f1c-51748ccefff3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:09:53 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] VM Resumed (Lifecycle Event) Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 
04e443ff-f9f1-4424-847d-ea7557307ec6] Instance event wait completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Starting instance... {{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:53 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Instance spawned successfully. Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:53 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] During sync_power_state the instance has a pending task (spawning). Skip. 
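Note on the VIF plug sequence recorded above (Converting VIF -> Converted object VIFOpenVSwitch -> Plugging vif -> Successfully plugged): Nova converts the Neutron port dict into an os-vif VIFOpenVSwitch object and hands it to the os-vif 'ovs' plugin, which issues the AddBridgeCommand / AddPortCommand / DbSetCommand OVSDB transactions shown in the ovsdbapp entries. Below is a minimal sketch of driving the same plug directly with the os_vif library; the field values are copied from the log, but the exact constructor arguments, and the assumption of a local Open vSwitch plus sufficient privileges, are mine, and this is not Nova's own code path.

    # Illustrative sketch only: plug the port from the log with os-vif directly.
    # Values come from the log entries above; everything else is an assumption.
    import os_vif
    from os_vif.objects import instance_info as osv_instance
    from os_vif.objects import network as osv_network
    from os_vif.objects import vif as osv_vif

    os_vif.initialize()  # loads the linux_bridge/noop/ovs plugins, as at service startup

    vif = osv_vif.VIFOpenVSwitch(
        id='4629d4b1-f472-4302-9bf6-94f62369c1c1',
        address='fa:16:3e:a8:48:3e',
        vif_name='tap4629d4b1-f4',
        bridge_name='br-int',
        port_profile=osv_vif.VIFPortProfileOpenVSwitch(
            interface_id='4629d4b1-f472-4302-9bf6-94f62369c1c1'),
        network=osv_network.Network(
            id='d8b9fec1-bd04-4ad0-a353-9e6f92eee9dd', bridge='br-int'))

    instance = osv_instance.InstanceInfo(
        uuid='0c76babb-25fa-4e8b-9e09-c705153a95e5',
        name='instance-0000000a')

    # Triggers the ovs plugin, i.e. the AddBridge/AddPort/DbSet transactions logged above.
    os_vif.plug(vif, instance)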
Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:09:53 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] VM Started (Lifecycle Event) Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Found default for hw_vif_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event 
/opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:09:53 user nova-compute[71972]: INFO nova.compute.claims [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Claim successful on node user Apr 17 22:09:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:53 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:09:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:53 user nova-compute[71972]: INFO nova.compute.manager [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Took 5.80 seconds to spawn the instance on the hypervisor. Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:53 user nova-compute[71972]: INFO nova.compute.manager [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Took 6.81 seconds to build instance. 
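For context on the earlier topology entries ('Flavor limits 0:0:0' ... 'Chose sockets=0, cores=0, threads=0' ... 'Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)]'): a 0 means 'no preference', so any topology whose sockets x cores x threads equals the flavor's vCPU count and stays within the 65536-per-dimension maxima is a candidate, and for the 1-vCPU m1.nano flavor that collapses to 1:1:1. The enumeration below is a standalone illustration of that idea, not Nova's actual nova.virt.hardware implementation.

    # Illustrative enumeration of candidate CPU topologies for a given vCPU count.
    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        """Return (sockets, cores, threads) tuples whose product equals vcpus."""
        found = []
        for sockets in range(1, min(vcpus, max_sockets) + 1):
            if vcpus % sockets:
                continue
            per_socket = vcpus // sockets
            for cores in range(1, min(per_socket, max_cores) + 1):
                if per_socket % cores:
                    continue
                threads = per_socket // cores
                if threads <= max_threads:
                    found.append((sockets, cores, threads))
        return found

    print(possible_topologies(1))  # [(1, 1, 1)] -- matches "Got 1 possible topologies"
    print(possible_topologies(4))  # several candidates once there is more than one vCPU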
Apr 17 22:09:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-31dd98f3-fc80-42c4-8bb6-5f0daa599fc5 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "04e443ff-f9f1-4424-847d-ea7557307ec6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.933s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.compute.manager [req-f22949c5-23ef-4a39-8306-c68838d8d313 req-165018bd-3447-4573-9415-3a07509065b7 service nova] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Received event network-vif-plugged-1f194f69-5d98-4774-958a-1b5b81d978d3 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-f22949c5-23ef-4a39-8306-c68838d8d313 req-165018bd-3447-4573-9415-3a07509065b7 service nova] Acquiring lock "04e443ff-f9f1-4424-847d-ea7557307ec6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-f22949c5-23ef-4a39-8306-c68838d8d313 req-165018bd-3447-4573-9415-3a07509065b7 service nova] Lock "04e443ff-f9f1-4424-847d-ea7557307ec6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-f22949c5-23ef-4a39-8306-c68838d8d313 req-165018bd-3447-4573-9415-3a07509065b7 service nova] Lock "04e443ff-f9f1-4424-847d-ea7557307ec6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.compute.manager [req-f22949c5-23ef-4a39-8306-c68838d8d313 req-165018bd-3447-4573-9415-3a07509065b7 service nova] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] No waiting events found dispatching network-vif-plugged-1f194f69-5d98-4774-958a-1b5b81d978d3 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:09:53 user nova-compute[71972]: WARNING nova.compute.manager [req-f22949c5-23ef-4a39-8306-c68838d8d313 req-165018bd-3447-4573-9415-3a07509065b7 service nova] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Received unexpected event network-vif-plugged-1f194f69-5d98-4774-958a-1b5b81d978d3 for instance with vm_state active and task_state None. 
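The recurring 'Acquiring lock ... by ...', 'Lock ... acquired by ... :: waited', 'Lock ... released by ... :: held' trios in these entries come from oslo.concurrency's lock helpers wrapping the compute manager's critical sections. A minimal sketch of the same pattern with lockutils follows; the lock names and the decorated function are illustrative rather than Nova's actual code, and external=True would additionally take a file lock (as with the disk.info lock later in this log).

    # Minimal sketch of the oslo.concurrency locking pattern seen throughout this log.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('d9386728-7c3f-42ff-8f1c-51748ccefff3')
    def locked_do_build_and_run_instance():
        # Runs with the named semaphore held; lockutils emits the
        # "acquired ... waited" / "released ... held" debug lines around it.
        pass

    # The same helper is available as a context manager:
    with lockutils.lock('compute_resources'):
        pass  # e.g. the resource tracker's instance_claim critical section

    locked_do_build_and_run_instance()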
Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.compute.manager [req-f5dcb708-c04f-46a5-a6af-4cb0bd9aab19 req-e4a61f6a-794f-404a-8e1c-c17c35b7ea72 service nova] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Received event network-vif-plugged-4629d4b1-f472-4302-9bf6-94f62369c1c1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-f5dcb708-c04f-46a5-a6af-4cb0bd9aab19 req-e4a61f6a-794f-404a-8e1c-c17c35b7ea72 service nova] Acquiring lock "0c76babb-25fa-4e8b-9e09-c705153a95e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-f5dcb708-c04f-46a5-a6af-4cb0bd9aab19 req-e4a61f6a-794f-404a-8e1c-c17c35b7ea72 service nova] Lock "0c76babb-25fa-4e8b-9e09-c705153a95e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-f5dcb708-c04f-46a5-a6af-4cb0bd9aab19 req-e4a61f6a-794f-404a-8e1c-c17c35b7ea72 service nova] Lock "0c76babb-25fa-4e8b-9e09-c705153a95e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.compute.manager [req-f5dcb708-c04f-46a5-a6af-4cb0bd9aab19 req-e4a61f6a-794f-404a-8e1c-c17c35b7ea72 service nova] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] No waiting events found dispatching network-vif-plugged-4629d4b1-f472-4302-9bf6-94f62369c1c1 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:09:53 user nova-compute[71972]: WARNING nova.compute.manager [req-f5dcb708-c04f-46a5-a6af-4cb0bd9aab19 req-e4a61f6a-794f-404a-8e1c-c17c35b7ea72 service nova] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Received unexpected event network-vif-plugged-4629d4b1-f472-4302-9bf6-94f62369c1c1 for instance with vm_state building and task_state spawning. 
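The 'Received event network-vif-plugged-...', 'No waiting events found dispatching ...' and 'Received unexpected event ...' entries above are Neutron's external events arriving after the corresponding waiter was already satisfied, or before one was registered: the compute manager keys pending events per instance, pops the matching entry when Neutron reports it, and logs the event as unexpected if nothing is waiting. The sketch below shows that keyed-event pattern in a generic, self-contained form; the class and method names are illustrative and are not Nova's InstanceEvents API.

    # Generic sketch of a keyed "wait for an external event" registry.
    import threading
    from collections import defaultdict

    class ExternalEvents:
        def __init__(self):
            self._lock = threading.Lock()
            self._events = defaultdict(dict)  # instance_uuid -> {event_key: Event}

        def prepare(self, instance_uuid, event_key):
            """Register interest before starting the action that causes the event."""
            with self._lock:
                event = threading.Event()
                self._events[instance_uuid][event_key] = event
                return event

        def pop(self, instance_uuid, event_key):
            """Called when the external service reports the event; None if nobody waits."""
            with self._lock:
                return self._events[instance_uuid].pop(event_key, None)

    registry = ExternalEvents()
    uuid = '0c76babb-25fa-4e8b-9e09-c705153a95e5'
    key = 'network-vif-plugged-4629d4b1-f472-4302-9bf6-94f62369c1c1'
    waiter = registry.prepare(uuid, key)   # before plugging the VIF

    event = registry.pop(uuid, key)        # in the handler for Neutron's callback
    if event is None:
        print('No waiting events found; received unexpected event')  # as in the warnings above
    else:
        event.set()                        # wakes whoever blocks in waiter.wait(timeout)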
Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.511s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Allocating IP information in the background. {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.network.neutron [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:09:53 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Start building block device mappings for instance. 
{{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.policy [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '70af0dc4dbf24ae1add76f3c87f8b1b5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '416fcd7cd2bc486884f751acab268fd8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Start spawning the instance on the hypervisor. {{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:09:53 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Creating image(s) Apr 17 22:09:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Acquiring lock "/opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "/opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "/opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None 
req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.129s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:53 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:54 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.129s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:54 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw 
/opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:54 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk 1073741824" returned: 0 in 0.044s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.180s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:54 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:54 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.129s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:54 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Checking if we can resize image /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk. 
size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:09:54 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:09:54 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:09:54 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Cannot resize image /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk to a smaller size. {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:09:54 user nova-compute[71972]: DEBUG nova.objects.instance [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lazy-loading 'migration_context' on Instance uuid d9386728-7c3f-42ff-8f1c-51748ccefff3 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:09:54 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:09:54 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Ensure instance console log exists: /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:09:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "vgpu_resources" acquired by 
"nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:54 user nova-compute[71972]: DEBUG nova.network.neutron [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Successfully created port: 54d6ec48-0412-4678-9745-e657a446347d {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:09:55 user nova-compute[71972]: DEBUG nova.compute.manager [req-62a9011c-f621-4076-86a3-51f6ae6a436f req-da131d2d-e5f3-49b6-9e85-395d1d3c56cb service nova] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Received event network-vif-plugged-4629d4b1-f472-4302-9bf6-94f62369c1c1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-62a9011c-f621-4076-86a3-51f6ae6a436f req-da131d2d-e5f3-49b6-9e85-395d1d3c56cb service nova] Acquiring lock "0c76babb-25fa-4e8b-9e09-c705153a95e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-62a9011c-f621-4076-86a3-51f6ae6a436f req-da131d2d-e5f3-49b6-9e85-395d1d3c56cb service nova] Lock "0c76babb-25fa-4e8b-9e09-c705153a95e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-62a9011c-f621-4076-86a3-51f6ae6a436f req-da131d2d-e5f3-49b6-9e85-395d1d3c56cb service nova] Lock "0c76babb-25fa-4e8b-9e09-c705153a95e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:55 user nova-compute[71972]: DEBUG nova.compute.manager [req-62a9011c-f621-4076-86a3-51f6ae6a436f req-da131d2d-e5f3-49b6-9e85-395d1d3c56cb service nova] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] No waiting events found dispatching network-vif-plugged-4629d4b1-f472-4302-9bf6-94f62369c1c1 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:09:55 user nova-compute[71972]: WARNING nova.compute.manager [req-62a9011c-f621-4076-86a3-51f6ae6a436f req-da131d2d-e5f3-49b6-9e85-395d1d3c56cb service nova] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Received unexpected event network-vif-plugged-4629d4b1-f472-4302-9bf6-94f62369c1c1 for instance with vm_state building and task_state spawning. 
Apr 17 22:09:55 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:09:55 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] VM Resumed (Lifecycle Event) Apr 17 22:09:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Instance event wait completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:09:55 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:09:55 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Instance spawned successfully. Apr 17 22:09:55 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:09:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:09:55 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:55 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:55 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 
tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:55 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:55 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:55 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Found default for hw_vif_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:09:55 user nova-compute[71972]: DEBUG nova.network.neutron [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Successfully updated port: 54d6ec48-0412-4678-9745-e657a446347d {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:09:55 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] During sync_power_state the instance has a pending task (spawning). Skip. 
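The paired "Acquiring lock ... by ...", "acquired by ... :: waited" and '"released" by ... :: held' DEBUG lines that recur throughout this log are emitted by oslo_concurrency's synchronized decorator. A minimal sketch of the pattern, with an illustrative function body rather than Nova's actual code:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('vgpu_resources')
    def allocate_mdevs():
        # While this body runs, the named lock is held; oslo.concurrency's "inner"
        # wrapper logs the acquire (with time waited) before the call and the
        # release (with time held) after it, which is what appears above.
        pass

    allocate_mdevs()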
Apr 17 22:09:55 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:09:55 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] VM Started (Lifecycle Event) Apr 17 22:09:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Acquiring lock "refresh_cache-d9386728-7c3f-42ff-8f1c-51748ccefff3" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Acquired lock "refresh_cache-d9386728-7c3f-42ff-8f1c-51748ccefff3" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:55 user nova-compute[71972]: DEBUG nova.network.neutron [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:09:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:09:55 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:09:55 user nova-compute[71972]: INFO nova.compute.manager [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Took 7.55 seconds to spawn the instance on the hypervisor. Apr 17 22:09:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:09:55 user nova-compute[71972]: DEBUG nova.network.neutron [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Instance cache missing network info. 
{{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:09:55 user nova-compute[71972]: INFO nova.compute.manager [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Took 9.04 seconds to build instance. Apr 17 22:09:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9fa9f74-232e-48ff-8c07-87a7032573fc tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Lock "0c76babb-25fa-4e8b-9e09-c705153a95e5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 9.137s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.network.neutron [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Updating instance_info_cache with network_info: [{"id": "54d6ec48-0412-4678-9745-e657a446347d", "address": "fa:16:3e:c6:26:04", "network": {"id": "83a63ee2-d2cb-414d-bab9-c556ee1c2c88", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-625420379-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "416fcd7cd2bc486884f751acab268fd8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d6ec48-04", "ovs_interfaceid": "54d6ec48-0412-4678-9745-e657a446347d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Releasing lock "refresh_cache-d9386728-7c3f-42ff-8f1c-51748ccefff3" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.compute.manager [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Instance network_info: |[{"id": "54d6ec48-0412-4678-9745-e657a446347d", "address": "fa:16:3e:c6:26:04", "network": {"id": "83a63ee2-d2cb-414d-bab9-c556ee1c2c88", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-625420379-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "416fcd7cd2bc486884f751acab268fd8", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d6ec48-04", "ovs_interfaceid": "54d6ec48-0412-4678-9745-e657a446347d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Start _get_guest_xml network_info=[{"id": "54d6ec48-0412-4678-9745-e657a446347d", "address": "fa:16:3e:c6:26:04", "network": {"id": "83a63ee2-d2cb-414d-bab9-c556ee1c2c88", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-625420379-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "416fcd7cd2bc486884f751acab268fd8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d6ec48-04", "ovs_interfaceid": "54d6ec48-0412-4678-9745-e657a446347d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:09:56 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:09:56 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) 
{{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:09:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1397556093',display_name='tempest-VolumesAdminNegativeTest-server-1397556093',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1397556093',id=11,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDFeeT2ktb2ezEbDneceaEkdmYL0Q+OMfaTSuKEyaa3rHRe7RFynAalW4DVkVwOj9o4faZsUrmyKgb2c54eejkRheCP0UOpP4kS7Z/K/ZA1t+erJPufbj/hsftZaYU+qdg==',key_name='tempest-keypair-1691187973',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='416fcd7cd2bc486884f751acab268fd8',ramdisk_id='',reservation_id='r-yqs02rbn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-315022411',owner_user_name='tempest-VolumesAdminNegativeTest-315022411-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:09:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='70af0dc4dbf24ae1add76f3c87f8b1b5',uuid=d9386728-7c3f-42ff-8f1c-51748ccefff3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "54d6ec48-0412-4678-9745-e657a446347d", "address": "fa:16:3e:c6:26:04", "network": {"id": "83a63ee2-d2cb-414d-bab9-c556ee1c2c88", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-625420379-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "416fcd7cd2bc486884f751acab268fd8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d6ec48-04", "ovs_interfaceid": "54d6ec48-0412-4678-9745-e657a446347d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Converting VIF {"id": "54d6ec48-0412-4678-9745-e657a446347d", "address": "fa:16:3e:c6:26:04", "network": {"id": "83a63ee2-d2cb-414d-bab9-c556ee1c2c88", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-625420379-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "416fcd7cd2bc486884f751acab268fd8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d6ec48-04", "ovs_interfaceid": "54d6ec48-0412-4678-9745-e657a446347d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:26:04,bridge_name='br-int',has_traffic_filtering=True,id=54d6ec48-0412-4678-9745-e657a446347d,network=Network(83a63ee2-d2cb-414d-bab9-c556ee1c2c88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d6ec48-04') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.objects.instance [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lazy-loading 'pci_devices' on Instance uuid d9386728-7c3f-42ff-8f1c-51748ccefff3 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] End _get_guest_xml xml= Apr 17 22:09:56 user nova-compute[71972]: d9386728-7c3f-42ff-8f1c-51748ccefff3 Apr 17 22:09:56 user nova-compute[71972]: instance-0000000b Apr 17 22:09:56 user nova-compute[71972]: 131072 Apr 17 22:09:56 user nova-compute[71972]: 1 Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: tempest-VolumesAdminNegativeTest-server-1397556093 Apr 17 22:09:56 user nova-compute[71972]: 2023-04-17 22:09:56 Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: 128 Apr 17 22:09:56 user nova-compute[71972]: 1 Apr 17 22:09:56 user nova-compute[71972]: 0 Apr 17 22:09:56 user nova-compute[71972]: 0 Apr 17 22:09:56 user nova-compute[71972]: 1 Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: tempest-VolumesAdminNegativeTest-315022411-project-member Apr 17 22:09:56 user nova-compute[71972]: tempest-VolumesAdminNegativeTest-315022411 Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: OpenStack Foundation Apr 17 22:09:56 user nova-compute[71972]: OpenStack Nova Apr 17 22:09:56 user nova-compute[71972]: 0.0.0 Apr 17 22:09:56 user 
nova-compute[71972]: d9386728-7c3f-42ff-8f1c-51748ccefff3 Apr 17 22:09:56 user nova-compute[71972]: d9386728-7c3f-42ff-8f1c-51748ccefff3 Apr 17 22:09:56 user nova-compute[71972]: Virtual Machine Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: hvm Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Nehalem Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: /dev/urandom Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: Apr 17 22:09:56 user nova-compute[71972]: {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:09:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1397556093',display_name='tempest-VolumesAdminNegativeTest-server-1397556093',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1397556093',id=11,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDFeeT2ktb2ezEbDneceaEkdmYL0Q+OMfaTSuKEyaa3rHRe7RFynAalW4DVkVwOj9o4faZsUrmyKgb2c54eejkRheCP0UOpP4kS7Z/K/ZA1t+erJPufbj/hsftZaYU+qdg==',key_name='tempest-keypair-1691187973',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='416fcd7cd2bc486884f751acab268fd8',ramdisk_id='',reservation_id='r-yqs02rbn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-315022411',owner_user_name='tempest-VolumesAdminNegativeTest-315022411-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:09:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='70af0dc4dbf24ae1add76f3c87f8b1b5',uuid=d9386728-7c3f-42ff-8f1c-51748ccefff3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "54d6ec48-0412-4678-9745-e657a446347d", "address": "fa:16:3e:c6:26:04", "network": {"id": "83a63ee2-d2cb-414d-bab9-c556ee1c2c88", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-625420379-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "416fcd7cd2bc486884f751acab268fd8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d6ec48-04", "ovs_interfaceid": "54d6ec48-0412-4678-9745-e657a446347d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Converting VIF {"id": "54d6ec48-0412-4678-9745-e657a446347d", "address": "fa:16:3e:c6:26:04", "network": {"id": "83a63ee2-d2cb-414d-bab9-c556ee1c2c88", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-625420379-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": 
false, "tenant_id": "416fcd7cd2bc486884f751acab268fd8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d6ec48-04", "ovs_interfaceid": "54d6ec48-0412-4678-9745-e657a446347d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:26:04,bridge_name='br-int',has_traffic_filtering=True,id=54d6ec48-0412-4678-9745-e657a446347d,network=Network(83a63ee2-d2cb-414d-bab9-c556ee1c2c88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d6ec48-04') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG os_vif [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:26:04,bridge_name='br-int',has_traffic_filtering=True,id=54d6ec48-0412-4678-9745-e657a446347d,network=Network(83a63ee2-d2cb-414d-bab9-c556ee1c2c88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d6ec48-04') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap54d6ec48-04, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap54d6ec48-04, col_values=(('external_ids', {'iface-id': '54d6ec48-0412-4678-9745-e657a446347d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c6:26:04', 'vm-uuid': 'd9386728-7c3f-42ff-8f1c-51748ccefff3'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:56 user nova-compute[71972]: INFO os_vif [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:26:04,bridge_name='br-int',has_traffic_filtering=True,id=54d6ec48-0412-4678-9745-e657a446347d,network=Network(83a63ee2-d2cb-414d-bab9-c556ee1c2c88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d6ec48-04') Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] No BDM found with device name vda, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:09:56 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] No VIF found with MAC fa:16:3e:c6:26:04, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:09:57 user nova-compute[71972]: DEBUG nova.compute.manager [req-221539b0-aed7-4fe8-865f-eb619554d0a3 req-98076cc2-621f-4f45-b41d-9975fa2e0133 service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Received event network-changed-54d6ec48-0412-4678-9745-e657a446347d {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:57 user nova-compute[71972]: DEBUG nova.compute.manager [req-221539b0-aed7-4fe8-865f-eb619554d0a3 req-98076cc2-621f-4f45-b41d-9975fa2e0133 service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Refreshing instance network info cache due to event network-changed-54d6ec48-0412-4678-9745-e657a446347d. 
{{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:09:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-221539b0-aed7-4fe8-865f-eb619554d0a3 req-98076cc2-621f-4f45-b41d-9975fa2e0133 service nova] Acquiring lock "refresh_cache-d9386728-7c3f-42ff-8f1c-51748ccefff3" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:09:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-221539b0-aed7-4fe8-865f-eb619554d0a3 req-98076cc2-621f-4f45-b41d-9975fa2e0133 service nova] Acquired lock "refresh_cache-d9386728-7c3f-42ff-8f1c-51748ccefff3" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:09:57 user nova-compute[71972]: DEBUG nova.network.neutron [req-221539b0-aed7-4fe8-865f-eb619554d0a3 req-98076cc2-621f-4f45-b41d-9975fa2e0133 service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Refreshing network info cache for port 54d6ec48-0412-4678-9745-e657a446347d {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:09:57 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:57 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:57 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:57 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:58 user nova-compute[71972]: DEBUG nova.network.neutron [req-221539b0-aed7-4fe8-865f-eb619554d0a3 req-98076cc2-621f-4f45-b41d-9975fa2e0133 service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Updated VIF entry in instance network info cache for port 54d6ec48-0412-4678-9745-e657a446347d. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:09:58 user nova-compute[71972]: DEBUG nova.network.neutron [req-221539b0-aed7-4fe8-865f-eb619554d0a3 req-98076cc2-621f-4f45-b41d-9975fa2e0133 service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Updating instance_info_cache with network_info: [{"id": "54d6ec48-0412-4678-9745-e657a446347d", "address": "fa:16:3e:c6:26:04", "network": {"id": "83a63ee2-d2cb-414d-bab9-c556ee1c2c88", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-625420379-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "416fcd7cd2bc486884f751acab268fd8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d6ec48-04", "ovs_interfaceid": "54d6ec48-0412-4678-9745-e657a446347d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:09:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-221539b0-aed7-4fe8-865f-eb619554d0a3 req-98076cc2-621f-4f45-b41d-9975fa2e0133 service nova] Releasing lock "refresh_cache-d9386728-7c3f-42ff-8f1c-51748ccefff3" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:09:58 user nova-compute[71972]: DEBUG nova.compute.manager [req-3cb7ca60-31e2-4427-9eb6-ddafdd2ba607 req-6e845d91-70e5-4687-ba84-c3d383aeec8c service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Received event network-vif-plugged-54d6ec48-0412-4678-9745-e657a446347d {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:09:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-3cb7ca60-31e2-4427-9eb6-ddafdd2ba607 req-6e845d91-70e5-4687-ba84-c3d383aeec8c service nova] Acquiring lock "d9386728-7c3f-42ff-8f1c-51748ccefff3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:09:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-3cb7ca60-31e2-4427-9eb6-ddafdd2ba607 req-6e845d91-70e5-4687-ba84-c3d383aeec8c service nova] Lock "d9386728-7c3f-42ff-8f1c-51748ccefff3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:09:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-3cb7ca60-31e2-4427-9eb6-ddafdd2ba607 req-6e845d91-70e5-4687-ba84-c3d383aeec8c service nova] Lock "d9386728-7c3f-42ff-8f1c-51748ccefff3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:09:58 user nova-compute[71972]: DEBUG nova.compute.manager [req-3cb7ca60-31e2-4427-9eb6-ddafdd2ba607 req-6e845d91-70e5-4687-ba84-c3d383aeec8c service nova] 
[instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] No waiting events found dispatching network-vif-plugged-54d6ec48-0412-4678-9745-e657a446347d {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:09:58 user nova-compute[71972]: WARNING nova.compute.manager [req-3cb7ca60-31e2-4427-9eb6-ddafdd2ba607 req-6e845d91-70e5-4687-ba84-c3d383aeec8c service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Received unexpected event network-vif-plugged-54d6ec48-0412-4678-9745-e657a446347d for instance with vm_state building and task_state spawning. Apr 17 22:09:58 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:58 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:58 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:09:58 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:00 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:10:00 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] VM Resumed (Lifecycle Event) Apr 17 22:10:00 user nova-compute[71972]: DEBUG nova.compute.manager [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Instance event wait completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:10:00 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:10:00 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Instance spawned successfully. 
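The AddBridgeCommand, AddPortCommand and DbSetCommand transactions logged a little earlier (while os-vif plugged VIFOpenVSwitch for port 54d6ec48-0412-4678-9745-e657a446347d) are ovsdbapp operations batched into a single OVSDB transaction. A rough sketch of the same three commands through ovsdbapp's Open_vSwitch API; the database socket path is an assumption for a typical devstack host, and os-vif manages its own connection internally:

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    OVSDB = 'unix:/usr/local/var/run/openvswitch/db.sock'  # assumed endpoint

    idl = connection.OvsdbIdl.from_server(OVSDB, 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl, timeout=10))

    # Same commands seen in the log: add the integration bridge (a no-op if it
    # exists), add the tap port, then tag the Interface row with Neutron metadata.
    with api.transaction(check_error=True) as txn:
        txn.add(api.add_br('br-int', may_exist=True, datapath_type='system'))
        txn.add(api.add_port('br-int', 'tap54d6ec48-04', may_exist=True))
        txn.add(api.db_set(
            'Interface', 'tap54d6ec48-04',
            ('external_ids', {
                'iface-id': '54d6ec48-0412-4678-9745-e657a446347d',
                'iface-status': 'active',
                'attached-mac': 'fa:16:3e:c6:26:04',
                'vm-uuid': 'd9386728-7c3f-42ff-8f1c-51748ccefff3'})))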
Apr 17 22:10:00 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:10:00 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:10:00 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:10:00 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:10:00 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:10:00 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:10:00 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:10:00 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Found default for hw_vif_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:10:00 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: 
building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:10:00 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:10:00 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:10:00 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] VM Started (Lifecycle Event) Apr 17 22:10:00 user nova-compute[71972]: DEBUG nova.compute.manager [req-ee63eb8e-81c8-41fe-8b25-31e49618e96b req-2d747f1a-197f-4abc-a484-f545af62f637 service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Received event network-vif-plugged-54d6ec48-0412-4678-9745-e657a446347d {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:10:00 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-ee63eb8e-81c8-41fe-8b25-31e49618e96b req-2d747f1a-197f-4abc-a484-f545af62f637 service nova] Acquiring lock "d9386728-7c3f-42ff-8f1c-51748ccefff3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:00 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-ee63eb8e-81c8-41fe-8b25-31e49618e96b req-2d747f1a-197f-4abc-a484-f545af62f637 service nova] Lock "d9386728-7c3f-42ff-8f1c-51748ccefff3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:00 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-ee63eb8e-81c8-41fe-8b25-31e49618e96b req-2d747f1a-197f-4abc-a484-f545af62f637 service nova] Lock "d9386728-7c3f-42ff-8f1c-51748ccefff3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:00 user nova-compute[71972]: DEBUG nova.compute.manager [req-ee63eb8e-81c8-41fe-8b25-31e49618e96b req-2d747f1a-197f-4abc-a484-f545af62f637 service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] No waiting events found dispatching network-vif-plugged-54d6ec48-0412-4678-9745-e657a446347d {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:10:00 user nova-compute[71972]: WARNING nova.compute.manager [req-ee63eb8e-81c8-41fe-8b25-31e49618e96b req-2d747f1a-197f-4abc-a484-f545af62f637 service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Received unexpected event network-vif-plugged-54d6ec48-0412-4678-9745-e657a446347d for instance with vm_state building and task_state spawning. 
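The "current DB power_state: 0, VM power_state: 1" pairs in the sync messages above are the integer constants Nova uses for guest power states. For reference, the values as defined in nova/compute/power_state.py in recent releases (a reading aid, not new code paths):

    # nova.compute.power_state constants (hex values as defined upstream)
    NOSTATE = 0x00     # what the DB holds before the guest has ever been seen running
    RUNNING = 0x01     # what libvirt reports once the domain is up
    PAUSED = 0x03
    SHUTDOWN = 0x04
    CRASHED = 0x06
    SUSPENDED = 0x07

    # So the lines above say: DB = NOSTATE (0), hypervisor = RUNNING (1), and the
    # sync is skipped because task_state is still 'spawning'.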
Apr 17 22:10:00 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:10:00 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:10:00 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:10:00 user nova-compute[71972]: INFO nova.compute.manager [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Took 6.47 seconds to spawn the instance on the hypervisor. Apr 17 22:10:00 user nova-compute[71972]: DEBUG nova.compute.manager [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:10:00 user nova-compute[71972]: INFO nova.compute.manager [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Took 7.32 seconds to build instance. 
Apr 17 22:10:00 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6d59a6c8-4934-4cce-9955-507954e1b01a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "d9386728-7c3f-42ff-8f1c-51748ccefff3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.442s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:07 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:12 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:17 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:21 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:26 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:27 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:33 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:10:34 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic 
task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:10:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:36 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:10:36 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:10:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:36 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:10:36 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:36 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:36 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3681b009-1a99-4eb8-b189-3fe0647f5d1d/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3681b009-1a99-4eb8-b189-3fe0647f5d1d/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3681b009-1a99-4eb8-b189-3fe0647f5d1d/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/3681b009-1a99-4eb8-b189-3fe0647f5d1d/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:37 user 
nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0c76babb-25fa-4e8b-9e09-c705153a95e5/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:38 user nova-compute[71972]: DEBUG 
oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0c76babb-25fa-4e8b-9e09-c705153a95e5/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0c76babb-25fa-4e8b-9e09-c705153a95e5/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0c76babb-25fa-4e8b-9e09-c705153a95e5/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cbaa3995-f00d-4194-b7e2-29bfc6e27614/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cbaa3995-f00d-4194-b7e2-29bfc6e27614/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cbaa3995-f00d-4194-b7e2-29bfc6e27614/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cbaa3995-f00d-4194-b7e2-29bfc6e27614/disk --force-share --output=json" returned: 0 in 0.128s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/04e443ff-f9f1-4424-847d-ea7557307ec6/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/04e443ff-f9f1-4424-847d-ea7557307ec6/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/04e443ff-f9f1-4424-847d-ea7557307ec6/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/04e443ff-f9f1-4424-847d-ea7557307ec6/disk --force-share --output=json" returned: 0 in 0.126s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:39 user 
nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4fc7798-567a-4002-b056-6c4f02d0e955/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4fc7798-567a-4002-b056-6c4f02d0e955/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/c4fc7798-567a-4002-b056-6c4f02d0e955/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/c4fc7798-567a-4002-b056-6c4f02d0e955/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:40 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:10:40 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 22:10:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=7436MB free_disk=26.491355895996094GB free_vcpus=1 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:10:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance d9386728-7c3f-42ff-8f1c-51748ccefff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:10:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance c4fc7798-567a-4002-b056-6c4f02d0e955 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:10:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 0c76babb-25fa-4e8b-9e09-c705153a95e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:10:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance cbaa3995-f00d-4194-b7e2-29bfc6e27614 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:10:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 04e443ff-f9f1-4424-847d-ea7557307ec6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:10:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 9634492c-168d-4b49-941a-b89703571b73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:10:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 97067629-e099-49fd-bb79-223dd4401405 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:10:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 99cb7131-abb8-41d6-bddd-c3bc943b7678 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:10:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 3681b009-1a99-4eb8-b189-3fe0647f5d1d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:10:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 7bb0580b-727f-4168-9d56-56dcb4fa404e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:10:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance b2751b9c-c966-416d-aaaa-81756198849c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:10:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 11 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:10:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=1920MB phys_disk=40GB used_disk=11GB total_vcpus=12 used_vcpus=11 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:10:40 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:10:40 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:10:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:10:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.557s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:41 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:41 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:10:41 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:10:41 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:10:41 user nova-compute[71972]: DEBUG 
nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:10:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "refresh_cache-3681b009-1a99-4eb8-b189-3fe0647f5d1d" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:10:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquired lock "refresh_cache-3681b009-1a99-4eb8-b189-3fe0647f5d1d" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:10:42 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Forcefully refreshing network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 22:10:42 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Updating instance_info_cache with network_info: [{"id": "7557dbee-f2e2-47a0-88eb-1377350f8504", "address": "fa:16:3e:55:dc:d0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7557dbee-f2", "ovs_interfaceid": "7557dbee-f2e2-47a0-88eb-1377350f8504", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:10:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Releasing lock "refresh_cache-3681b009-1a99-4eb8-b189-3fe0647f5d1d" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:10:42 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Updated the network info_cache for instance {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 22:10:42 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:10:42 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) 
run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:10:42 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:10:42 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:10:42 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Acquiring lock "d7c93039-48e6-40b8-b921-d6eb1ebe78ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "d7c93039-48e6-40b8-b921-d6eb1ebe78ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:45 user nova-compute[71972]: DEBUG nova.compute.manager [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Starting instance... {{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:10:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:45 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:10:45 user nova-compute[71972]: INFO nova.compute.claims [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Claim successful on node user Apr 17 22:10:46 user nova-compute[71972]: DEBUG nova.compute.manager [req-83c4eecc-9ba9-4254-b8d4-6457dc971d60 req-4edcc5c6-ce04-44c2-a83e-2d2c0d64c814 service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Received event network-changed-7557dbee-f2e2-47a0-88eb-1377350f8504 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG nova.compute.manager [req-83c4eecc-9ba9-4254-b8d4-6457dc971d60 req-4edcc5c6-ce04-44c2-a83e-2d2c0d64c814 service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Refreshing instance network info cache due to event network-changed-7557dbee-f2e2-47a0-88eb-1377350f8504. {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-83c4eecc-9ba9-4254-b8d4-6457dc971d60 req-4edcc5c6-ce04-44c2-a83e-2d2c0d64c814 service nova] Acquiring lock "refresh_cache-3681b009-1a99-4eb8-b189-3fe0647f5d1d" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-83c4eecc-9ba9-4254-b8d4-6457dc971d60 req-4edcc5c6-ce04-44c2-a83e-2d2c0d64c814 service nova] Acquired lock "refresh_cache-3681b009-1a99-4eb8-b189-3fe0647f5d1d" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG nova.network.neutron [req-83c4eecc-9ba9-4254-b8d4-6457dc971d60 req-4edcc5c6-ce04-44c2-a83e-2d2c0d64c814 service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Refreshing network info cache for port 7557dbee-f2e2-47a0-88eb-1377350f8504 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 
tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.529s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG nova.compute.manager [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG nova.compute.manager [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Allocating IP information in the background. {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG nova.network.neutron [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:10:46 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 22:10:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG nova.compute.manager [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Start building block device mappings for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG nova.compute.manager [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Start spawning the instance on the hypervisor. 
{{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:10:46 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Creating image(s) Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Acquiring lock "/opt/stack/data/nova/instances/d7c93039-48e6-40b8-b921-d6eb1ebe78ef/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "/opt/stack/data/nova/instances/d7c93039-48e6-40b8-b921-d6eb1ebe78ef/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "/opt/stack/data/nova/instances/d7c93039-48e6-40b8-b921-d6eb1ebe78ef/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG nova.policy [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '51c0b269c97241d9ad122b23af3ca7ea', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f21699c3400842d3a28e71b288a4aaff', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 
'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG nova.compute.manager [req-9b0b70d3-0bcf-41bd-aca0-457f36a7559f req-c23aea8b-384a-4e03-852a-55e4a48031a5 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Received event network-changed-aede8066-45b3-4414-98a0-50dda5a4ee66 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG nova.compute.manager [req-9b0b70d3-0bcf-41bd-aca0-457f36a7559f req-c23aea8b-384a-4e03-852a-55e4a48031a5 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Refreshing instance network info cache due to event network-changed-aede8066-45b3-4414-98a0-50dda5a4ee66. {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-9b0b70d3-0bcf-41bd-aca0-457f36a7559f req-c23aea8b-384a-4e03-852a-55e4a48031a5 service nova] Acquiring lock "refresh_cache-99cb7131-abb8-41d6-bddd-c3bc943b7678" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-9b0b70d3-0bcf-41bd-aca0-457f36a7559f req-c23aea8b-384a-4e03-852a-55e4a48031a5 service nova] Acquired lock "refresh_cache-99cb7131-abb8-41d6-bddd-c3bc943b7678" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG nova.network.neutron [req-9b0b70d3-0bcf-41bd-aca0-457f36a7559f req-c23aea8b-384a-4e03-852a-55e4a48031a5 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Refreshing network info cache for port aede8066-45b3-4414-98a0-50dda5a4ee66 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Acquiring lock "c4fc7798-567a-4002-b056-6c4f02d0e955" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Lock "c4fc7798-567a-4002-b056-6c4f02d0e955" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Acquiring lock "c4fc7798-567a-4002-b056-6c4f02d0e955-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Lock "c4fc7798-567a-4002-b056-6c4f02d0e955-events" acquired 
by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Lock "c4fc7798-567a-4002-b056-6c4f02d0e955-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:46 user nova-compute[71972]: INFO nova.compute.manager [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Terminating instance Apr 17 22:10:46 user nova-compute[71972]: DEBUG nova.compute.manager [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Start destroying the instance on the hypervisor. {{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.147s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.137s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/d7c93039-48e6-40b8-b921-d6eb1ebe78ef/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/d7c93039-48e6-40b8-b921-d6eb1ebe78ef/disk 1073741824" returned: 0 in 0.052s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.193s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e 
tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.148s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Checking if we can resize image /opt/stack/data/nova/instances/d7c93039-48e6-40b8-b921-d6eb1ebe78ef/disk. size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:10:46 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7c93039-48e6-40b8-b921-d6eb1ebe78ef/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG nova.compute.manager [req-9292921e-cc30-4d5b-82c2-d5a5a533cac8 req-c0c0f9a6-8d0d-4e2e-af64-4c23fd0d806f service nova] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Received event network-vif-unplugged-4de27111-4afc-4732-88b3-2485c4f254e8 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-9292921e-cc30-4d5b-82c2-d5a5a533cac8 req-c0c0f9a6-8d0d-4e2e-af64-4c23fd0d806f service nova] Acquiring lock "c4fc7798-567a-4002-b056-6c4f02d0e955-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-9292921e-cc30-4d5b-82c2-d5a5a533cac8 req-c0c0f9a6-8d0d-4e2e-af64-4c23fd0d806f service nova] Lock "c4fc7798-567a-4002-b056-6c4f02d0e955-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-9292921e-cc30-4d5b-82c2-d5a5a533cac8 req-c0c0f9a6-8d0d-4e2e-af64-4c23fd0d806f service nova] Lock "c4fc7798-567a-4002-b056-6c4f02d0e955-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG nova.compute.manager [req-9292921e-cc30-4d5b-82c2-d5a5a533cac8 req-c0c0f9a6-8d0d-4e2e-af64-4c23fd0d806f service nova] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] No waiting events found dispatching network-vif-unplugged-4de27111-4afc-4732-88b3-2485c4f254e8 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG nova.compute.manager [req-9292921e-cc30-4d5b-82c2-d5a5a533cac8 
req-c0c0f9a6-8d0d-4e2e-af64-4c23fd0d806f service nova] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Received event network-vif-unplugged-4de27111-4afc-4732-88b3-2485c4f254e8 for instance with task_state deleting. {{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7c93039-48e6-40b8-b921-d6eb1ebe78ef/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Cannot resize image /opt/stack/data/nova/instances/d7c93039-48e6-40b8-b921-d6eb1ebe78ef/disk to a smaller size. {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG nova.objects.instance [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lazy-loading 'migration_context' on Instance uuid d7c93039-48e6-40b8-b921-d6eb1ebe78ef {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Ensure instance console log exists: /opt/stack/data/nova/instances/d7c93039-48e6-40b8-b921-d6eb1ebe78ef/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 
tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG nova.network.neutron [req-83c4eecc-9ba9-4254-b8d4-6457dc971d60 req-4edcc5c6-ce04-44c2-a83e-2d2c0d64c814 service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Updated VIF entry in instance network info cache for port 7557dbee-f2e2-47a0-88eb-1377350f8504. {{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG nova.network.neutron [req-83c4eecc-9ba9-4254-b8d4-6457dc971d60 req-4edcc5c6-ce04-44c2-a83e-2d2c0d64c814 service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Updating instance_info_cache with network_info: [{"id": "7557dbee-f2e2-47a0-88eb-1377350f8504", "address": "fa:16:3e:55:dc:d0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.127", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7557dbee-f2", "ovs_interfaceid": "7557dbee-f2e2-47a0-88eb-1377350f8504", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-83c4eecc-9ba9-4254-b8d4-6457dc971d60 req-4edcc5c6-ce04-44c2-a83e-2d2c0d64c814 service nova] Releasing lock "refresh_cache-3681b009-1a99-4eb8-b189-3fe0647f5d1d" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:10:47 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Instance destroyed successfully. 
Apr 17 22:10:47 user nova-compute[71972]: DEBUG nova.objects.instance [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Lazy-loading 'resources' on Instance uuid c4fc7798-567a-4002-b056-6c4f02d0e955 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:08:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1454701288',display_name='tempest-DeleteServersTestJSON-server-1454701288',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-deleteserverstestjson-server-1454701288',id=4,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-17T22:09:14Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='41c0b4d04b1b425db64e6ff5066f1dbe',ramdisk_id='',reservation_id='r-54xld834',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-DeleteServersTestJSON-894860321',owner_user_name='tempest-DeleteServersTestJSON-894860321-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:09:15Z,user_data=None,user_id='9ab44d4339554bfba6ac66bebad74413',uuid=c4fc7798-567a-4002-b056-6c4f02d0e955,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4de27111-4afc-4732-88b3-2485c4f254e8", "address": "fa:16:3e:4e:c3:49", "network": {"id": "c2d58e09-cb04-4c46-8176-ef70f3e76aa2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-75580352-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c0b4d04b1b425db64e6ff5066f1dbe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": 
"tap4de27111-4a", "ovs_interfaceid": "4de27111-4afc-4732-88b3-2485c4f254e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Converting VIF {"id": "4de27111-4afc-4732-88b3-2485c4f254e8", "address": "fa:16:3e:4e:c3:49", "network": {"id": "c2d58e09-cb04-4c46-8176-ef70f3e76aa2", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-75580352-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "41c0b4d04b1b425db64e6ff5066f1dbe", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4de27111-4a", "ovs_interfaceid": "4de27111-4afc-4732-88b3-2485c4f254e8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:c3:49,bridge_name='br-int',has_traffic_filtering=True,id=4de27111-4afc-4732-88b3-2485c4f254e8,network=Network(c2d58e09-cb04-4c46-8176-ef70f3e76aa2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de27111-4a') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG os_vif [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:c3:49,bridge_name='br-int',has_traffic_filtering=True,id=4de27111-4afc-4732-88b3-2485c4f254e8,network=Network(c2d58e09-cb04-4c46-8176-ef70f3e76aa2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de27111-4a') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4de27111-4a, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:47 user nova-compute[71972]: 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:10:47 user nova-compute[71972]: INFO os_vif [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:c3:49,bridge_name='br-int',has_traffic_filtering=True,id=4de27111-4afc-4732-88b3-2485c4f254e8,network=Network(c2d58e09-cb04-4c46-8176-ef70f3e76aa2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4de27111-4a') Apr 17 22:10:47 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Deleting instance files /opt/stack/data/nova/instances/c4fc7798-567a-4002-b056-6c4f02d0e955_del Apr 17 22:10:47 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Deletion of /opt/stack/data/nova/instances/c4fc7798-567a-4002-b056-6c4f02d0e955_del complete Apr 17 22:10:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Checking UEFI support for host arch (x86_64) {{(pid=71972) supports_uefi /opt/stack/nova/nova/virt/libvirt/host.py:1722}} Apr 17 22:10:47 user nova-compute[71972]: INFO nova.virt.libvirt.host [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] UEFI support detected Apr 17 22:10:47 user nova-compute[71972]: INFO nova.compute.manager [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Took 0.92 seconds to destroy the instance on the hypervisor. Apr 17 22:10:47 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG nova.network.neutron [req-9b0b70d3-0bcf-41bd-aca0-457f36a7559f req-c23aea8b-384a-4e03-852a-55e4a48031a5 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Updated VIF entry in instance network info cache for port aede8066-45b3-4414-98a0-50dda5a4ee66. {{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG nova.network.neutron [req-9b0b70d3-0bcf-41bd-aca0-457f36a7559f req-c23aea8b-384a-4e03-852a-55e4a48031a5 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Updating instance_info_cache with network_info: [{"id": "aede8066-45b3-4414-98a0-50dda5a4ee66", "address": "fa:16:3e:be:3c:02", "network": {"id": "966f14e1-bfad-4b86-921c-0f8f5ad29a5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1196755439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.137", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b99089f4e3074ee7a5c1ada03ceb8984", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaede8066-45", "ovs_interfaceid": "aede8066-45b3-4414-98a0-50dda5a4ee66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-9b0b70d3-0bcf-41bd-aca0-457f36a7559f req-c23aea8b-384a-4e03-852a-55e4a48031a5 service nova] Releasing lock "refresh_cache-99cb7131-abb8-41d6-bddd-c3bc943b7678" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG nova.network.neutron [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Successfully created port: b377f91d-95cf-42f6-8cb9-62aa8d68bcb1 {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] 
Acquiring lock "3681b009-1a99-4eb8-b189-3fe0647f5d1d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "3681b009-1a99-4eb8-b189-3fe0647f5d1d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "3681b009-1a99-4eb8-b189-3fe0647f5d1d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "3681b009-1a99-4eb8-b189-3fe0647f5d1d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "3681b009-1a99-4eb8-b189-3fe0647f5d1d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:47 user nova-compute[71972]: INFO nova.compute.manager [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Terminating instance Apr 17 22:10:47 user nova-compute[71972]: DEBUG nova.compute.manager [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Start destroying the instance on the hypervisor. 
{{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:48 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:10:48 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Took 0.64 seconds to deallocate network for instance. Apr 17 22:10:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:48 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Instance destroyed successfully. 
Apr 17 22:10:48 user nova-compute[71972]: DEBUG nova.objects.instance [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lazy-loading 'resources' on Instance uuid 3681b009-1a99-4eb8-b189-3fe0647f5d1d {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:10:48 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:08:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1530040262',display_name='tempest-AttachVolumeNegativeTest-server-1530040262',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-1530040262',id=2,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCjQzi1kjZQeEQ1rYjOKbLcVcDCRQQIijtqR97gxYJ2Onb6dZq9Ac7P5Uos+0FwBhyMNkY6cGIDdtzKuXupShf31TiuVlUPJpGQ92/3ShzPqtsJ2m3VXUdv5ryHzD1eLpg==',key_name='tempest-keypair-1969874175',keypairs=,launch_index=0,launched_at=2023-04-17T22:09:14Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='e52724ed9bc54905bd5eddd8504e4c77',ramdisk_id='',reservation_id='r-umipdb5q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-1678965362',owner_user_name='tempest-AttachVolumeNegativeTest-1678965362-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:09:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='52b3e35c03b54ae4b5dabfb1325886a9',uuid=3681b009-1a99-4eb8-b189-3fe0647f5d1d,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7557dbee-f2e2-47a0-88eb-1377350f8504", "address": "fa:16:3e:55:dc:d0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.127", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7557dbee-f2", "ovs_interfaceid": "7557dbee-f2e2-47a0-88eb-1377350f8504", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:10:48 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Converting VIF {"id": "7557dbee-f2e2-47a0-88eb-1377350f8504", "address": "fa:16:3e:55:dc:d0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.127", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7557dbee-f2", "ovs_interfaceid": "7557dbee-f2e2-47a0-88eb-1377350f8504", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:10:48 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:55:dc:d0,bridge_name='br-int',has_traffic_filtering=True,id=7557dbee-f2e2-47a0-88eb-1377350f8504,network=Network(bd9d53b7-0e22-42c4-8b91-49575cadf44f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7557dbee-f2') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:10:48 user nova-compute[71972]: DEBUG os_vif [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:55:dc:d0,bridge_name='br-int',has_traffic_filtering=True,id=7557dbee-f2e2-47a0-88eb-1377350f8504,network=Network(bd9d53b7-0e22-42c4-8b91-49575cadf44f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7557dbee-f2') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:10:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7557dbee-f2, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:10:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:10:48 user nova-compute[71972]: INFO os_vif [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:55:dc:d0,bridge_name='br-int',has_traffic_filtering=True,id=7557dbee-f2e2-47a0-88eb-1377350f8504,network=Network(bd9d53b7-0e22-42c4-8b91-49575cadf44f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7557dbee-f2') Apr 17 22:10:48 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Deleting instance files /opt/stack/data/nova/instances/3681b009-1a99-4eb8-b189-3fe0647f5d1d_del Apr 17 22:10:48 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Deletion of /opt/stack/data/nova/instances/3681b009-1a99-4eb8-b189-3fe0647f5d1d_del complete Apr 17 22:10:48 user nova-compute[71972]: INFO nova.compute.manager [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Took 0.69 seconds to destroy the instance on the hypervisor. Apr 17 22:10:48 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:10:48 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:10:48 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:10:48 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:10:48 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:10:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.406s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:48 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Deleted allocations for instance c4fc7798-567a-4002-b056-6c4f02d0e955 Apr 17 22:10:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3ba7f96d-ad1b-425d-a010-93c36f3d540b tempest-DeleteServersTestJSON-894860321 tempest-DeleteServersTestJSON-894860321-project-member] Lock "c4fc7798-567a-4002-b056-6c4f02d0e955" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.212s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:48 user nova-compute[71972]: DEBUG nova.compute.manager [req-4d2797ec-8cb3-4878-8d98-0eda9469d3d7 req-e9437638-b866-4318-ac85-1aaec1d2c90b service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Received event network-vif-unplugged-7557dbee-f2e2-47a0-88eb-1377350f8504 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:10:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-4d2797ec-8cb3-4878-8d98-0eda9469d3d7 req-e9437638-b866-4318-ac85-1aaec1d2c90b service nova] Acquiring lock "3681b009-1a99-4eb8-b189-3fe0647f5d1d-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-4d2797ec-8cb3-4878-8d98-0eda9469d3d7 req-e9437638-b866-4318-ac85-1aaec1d2c90b service nova] Lock "3681b009-1a99-4eb8-b189-3fe0647f5d1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-4d2797ec-8cb3-4878-8d98-0eda9469d3d7 req-e9437638-b866-4318-ac85-1aaec1d2c90b service nova] Lock "3681b009-1a99-4eb8-b189-3fe0647f5d1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:48 user nova-compute[71972]: DEBUG nova.compute.manager [req-4d2797ec-8cb3-4878-8d98-0eda9469d3d7 req-e9437638-b866-4318-ac85-1aaec1d2c90b service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] No waiting events found dispatching network-vif-unplugged-7557dbee-f2e2-47a0-88eb-1377350f8504 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:10:48 user nova-compute[71972]: DEBUG nova.compute.manager [req-4d2797ec-8cb3-4878-8d98-0eda9469d3d7 req-e9437638-b866-4318-ac85-1aaec1d2c90b service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Received event network-vif-unplugged-7557dbee-f2e2-47a0-88eb-1377350f8504 for instance with task_state deleting. {{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.network.neutron [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Successfully updated port: b377f91d-95cf-42f6-8cb9-62aa8d68bcb1 {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Acquiring lock "refresh_cache-d7c93039-48e6-40b8-b921-d6eb1ebe78ef" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Acquired lock "refresh_cache-d7c93039-48e6-40b8-b921-d6eb1ebe78ef" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.network.neutron [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.compute.manager [req-1de02810-cea7-4b95-8e44-34a4c01bc15c req-76c66c43-dba5-48fc-afa5-4b0857e59f6b service nova] [instance: 
c4fc7798-567a-4002-b056-6c4f02d0e955] Received event network-vif-plugged-4de27111-4afc-4732-88b3-2485c4f254e8 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-1de02810-cea7-4b95-8e44-34a4c01bc15c req-76c66c43-dba5-48fc-afa5-4b0857e59f6b service nova] Acquiring lock "c4fc7798-567a-4002-b056-6c4f02d0e955-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-1de02810-cea7-4b95-8e44-34a4c01bc15c req-76c66c43-dba5-48fc-afa5-4b0857e59f6b service nova] Lock "c4fc7798-567a-4002-b056-6c4f02d0e955-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-1de02810-cea7-4b95-8e44-34a4c01bc15c req-76c66c43-dba5-48fc-afa5-4b0857e59f6b service nova] Lock "c4fc7798-567a-4002-b056-6c4f02d0e955-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.compute.manager [req-1de02810-cea7-4b95-8e44-34a4c01bc15c req-76c66c43-dba5-48fc-afa5-4b0857e59f6b service nova] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] No waiting events found dispatching network-vif-plugged-4de27111-4afc-4732-88b3-2485c4f254e8 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:10:49 user nova-compute[71972]: WARNING nova.compute.manager [req-1de02810-cea7-4b95-8e44-34a4c01bc15c req-76c66c43-dba5-48fc-afa5-4b0857e59f6b service nova] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Received unexpected event network-vif-plugged-4de27111-4afc-4732-88b3-2485c4f254e8 for instance with vm_state deleted and task_state None. Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.compute.manager [req-1de02810-cea7-4b95-8e44-34a4c01bc15c req-76c66c43-dba5-48fc-afa5-4b0857e59f6b service nova] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Received event network-vif-deleted-4de27111-4afc-4732-88b3-2485c4f254e8 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.network.neutron [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Instance cache missing network info. 
{{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.network.neutron [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Updating instance_info_cache with network_info: [{"id": "b377f91d-95cf-42f6-8cb9-62aa8d68bcb1", "address": "fa:16:3e:6a:32:e4", "network": {"id": "3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb377f91d-95", "ovs_interfaceid": "b377f91d-95cf-42f6-8cb9-62aa8d68bcb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Releasing lock "refresh_cache-d7c93039-48e6-40b8-b921-d6eb1ebe78ef" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.compute.manager [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Instance network_info: |[{"id": "b377f91d-95cf-42f6-8cb9-62aa8d68bcb1", "address": "fa:16:3e:6a:32:e4", "network": {"id": "3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb377f91d-95", "ovs_interfaceid": "b377f91d-95cf-42f6-8cb9-62aa8d68bcb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Start _get_guest_xml 
network_info=[{"id": "b377f91d-95cf-42f6-8cb9-62aa8d68bcb1", "address": "fa:16:3e:6a:32:e4", "network": {"id": "3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb377f91d-95", "ovs_interfaceid": "b377f91d-95cf-42f6-8cb9-62aa8d68bcb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:10:49 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:10:49 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71972) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:10:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-54695480',display_name='tempest-ServersNegativeTestJSON-server-54695480',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-54695480',id=12,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f21699c3400842d3a28e71b288a4aaff',ramdisk_id='',reservation_id='r-fmibn9o9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1844623378',owner_user_name='tempest-ServersNegativeTestJSON-1844623378-project-member'},tags=TagList,tas
k_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:10:46Z,user_data=None,user_id='51c0b269c97241d9ad122b23af3ca7ea',uuid=d7c93039-48e6-40b8-b921-d6eb1ebe78ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b377f91d-95cf-42f6-8cb9-62aa8d68bcb1", "address": "fa:16:3e:6a:32:e4", "network": {"id": "3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb377f91d-95", "ovs_interfaceid": "b377f91d-95cf-42f6-8cb9-62aa8d68bcb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Converting VIF {"id": "b377f91d-95cf-42f6-8cb9-62aa8d68bcb1", "address": "fa:16:3e:6a:32:e4", "network": {"id": "3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb377f91d-95", "ovs_interfaceid": "b377f91d-95cf-42f6-8cb9-62aa8d68bcb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:32:e4,bridge_name='br-int',has_traffic_filtering=True,id=b377f91d-95cf-42f6-8cb9-62aa8d68bcb1,network=Network(3bbe159c-29cd-4095-9556-8169500b1716),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb377f91d-95') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.objects.instance [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lazy-loading 'pci_devices' on Instance uuid 
d7c93039-48e6-40b8-b921-d6eb1ebe78ef {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] End _get_guest_xml xml= [multi-line libvirt domain XML omitted; recoverable values: uuid d7c93039-48e6-40b8-b921-d6eb1ebe78ef, name instance-0000000c, memory 131072 KiB, 1 vCPU, Nova metadata name tempest-ServersNegativeTestJSON-server-54695480 created 2023-04-17 22:10:49, owner tempest-ServersNegativeTestJSON-1844623378-project-member / tempest-ServersNegativeTestJSON-1844623378, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0 / Virtual Machine, os type hvm, CPU model Nehalem, rng backend /dev/urandom]
{{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:10:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-54695480',display_name='tempest-ServersNegativeTestJSON-server-54695480',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-54695480',id=12,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f21699c3400842d3a28e71b288a4aaff',ramdisk_id='',reservation_id='r-fmibn9o9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServersNegativeTestJSON-1844623378',owner_user_name='tempest-ServersNegativeTestJSON-1844623378-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:10:46Z,user_data=None,user_id='51c0b269c97241d9ad122b23af3ca7ea',uuid=d7c93039-48e6-40b8-b921-d6eb1ebe78ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b377f91d-95cf-42f6-8cb9-62aa8d68bcb1", "address": "fa:16:3e:6a:32:e4", "network": {"id": "3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": 
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb377f91d-95", "ovs_interfaceid": "b377f91d-95cf-42f6-8cb9-62aa8d68bcb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Converting VIF {"id": "b377f91d-95cf-42f6-8cb9-62aa8d68bcb1", "address": "fa:16:3e:6a:32:e4", "network": {"id": "3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb377f91d-95", "ovs_interfaceid": "b377f91d-95cf-42f6-8cb9-62aa8d68bcb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:32:e4,bridge_name='br-int',has_traffic_filtering=True,id=b377f91d-95cf-42f6-8cb9-62aa8d68bcb1,network=Network(3bbe159c-29cd-4095-9556-8169500b1716),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb377f91d-95') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG os_vif [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:32:e4,bridge_name='br-int',has_traffic_filtering=True,id=b377f91d-95cf-42f6-8cb9-62aa8d68bcb1,network=Network(3bbe159c-29cd-4095-9556-8169500b1716),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb377f91d-95') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb377f91d-95, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb377f91d-95, col_values=(('external_ids', {'iface-id': 'b377f91d-95cf-42f6-8cb9-62aa8d68bcb1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6a:32:e4', 'vm-uuid': 'd7c93039-48e6-40b8-b921-d6eb1ebe78ef'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:49 user nova-compute[71972]: INFO os_vif [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:32:e4,bridge_name='br-int',has_traffic_filtering=True,id=b377f91d-95cf-42f6-8cb9-62aa8d68bcb1,network=Network(3bbe159c-29cd-4095-9556-8169500b1716),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb377f91d-95') Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] No VIF found with MAC fa:16:3e:6a:32:e4, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:10:49 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Took 1.17 seconds to deallocate network for instance. Apr 17 22:10:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:50 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:10:50 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:10:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.357s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:50 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 
tempest-AttachVolumeNegativeTest-1678965362-project-member] Deleted allocations for instance 3681b009-1a99-4eb8-b189-3fe0647f5d1d Apr 17 22:10:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3d5546d2-bf92-4143-a792-f5b48c67ad38 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "3681b009-1a99-4eb8-b189-3fe0647f5d1d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.420s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:50 user nova-compute[71972]: INFO nova.compute.manager [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Rescuing Apr 17 22:10:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquiring lock "refresh_cache-97067629-e099-49fd-bb79-223dd4401405" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:10:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquired lock "refresh_cache-97067629-e099-49fd-bb79-223dd4401405" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:10:50 user nova-compute[71972]: DEBUG nova.network.neutron [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:10:50 user nova-compute[71972]: DEBUG nova.network.neutron [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Updating instance_info_cache with network_info: [{"id": "63dc9a41-e89e-4673-a658-7acddd88706f", "address": "fa:16:3e:3e:d4:f0", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap63dc9a41-e8", "ovs_interfaceid": "63dc9a41-e89e-4673-a658-7acddd88706f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:10:50 user nova-compute[71972]: 
DEBUG oslo_concurrency.lockutils [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Releasing lock "refresh_cache-97067629-e099-49fd-bb79-223dd4401405" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:10:50 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:50 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:50 user nova-compute[71972]: DEBUG nova.compute.manager [req-478920c6-a1e6-42b6-8c70-30be76369c96 req-e8081bdd-a548-452e-94cb-1f4d8b554e8f service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Received event network-vif-plugged-7557dbee-f2e2-47a0-88eb-1377350f8504 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:10:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-478920c6-a1e6-42b6-8c70-30be76369c96 req-e8081bdd-a548-452e-94cb-1f4d8b554e8f service nova] Acquiring lock "3681b009-1a99-4eb8-b189-3fe0647f5d1d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-478920c6-a1e6-42b6-8c70-30be76369c96 req-e8081bdd-a548-452e-94cb-1f4d8b554e8f service nova] Lock "3681b009-1a99-4eb8-b189-3fe0647f5d1d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-478920c6-a1e6-42b6-8c70-30be76369c96 req-e8081bdd-a548-452e-94cb-1f4d8b554e8f service nova] Lock "3681b009-1a99-4eb8-b189-3fe0647f5d1d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:50 user nova-compute[71972]: DEBUG nova.compute.manager [req-478920c6-a1e6-42b6-8c70-30be76369c96 req-e8081bdd-a548-452e-94cb-1f4d8b554e8f service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] No waiting events found dispatching network-vif-plugged-7557dbee-f2e2-47a0-88eb-1377350f8504 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:10:50 user nova-compute[71972]: WARNING nova.compute.manager [req-478920c6-a1e6-42b6-8c70-30be76369c96 req-e8081bdd-a548-452e-94cb-1f4d8b554e8f service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Received unexpected event network-vif-plugged-7557dbee-f2e2-47a0-88eb-1377350f8504 for instance with vm_state deleted and task_state None. 
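The lock / "No waiting events found" / "Received unexpected event" sequence above is the external-event dispatch pattern: Neutron reports network-vif-plugged and network-vif-unplugged through the Nova API, and the compute manager pops any registered waiter under a per-instance "-events" lock, warning when the instance (already deleted here) never registered one. A simplified, self-contained sketch of that pattern, assuming nothing about Nova's actual InstanceEvents class (EventWaiters and its methods are illustrative names):

# Simplified sketch of the external-event dispatch behaviour visible above
# (per-instance "-events" critical section, "No waiting events found" when no
# waiter is registered). Illustrative only, not nova.compute.manager code.

import threading
from collections import defaultdict


class EventWaiters:
    def __init__(self):
        self._lock = threading.Lock()          # stands in for the "<uuid>-events" lock
        self._waiters = defaultdict(dict)      # instance_uuid -> {event_name: Event}

    def prepare(self, instance_uuid, event_name):
        """Register interest before triggering the operation (e.g. plugging a VIF)."""
        waiter = threading.Event()
        with self._lock:
            self._waiters[instance_uuid][event_name] = waiter
        return waiter

    def pop_and_signal(self, instance_uuid, event_name):
        """Called when an external event arrives for the instance."""
        with self._lock:
            waiter = self._waiters[instance_uuid].pop(event_name, None)
        if waiter is None:
            print(f"No waiting events found dispatching {event_name}")
            return False
        waiter.set()
        return True


if __name__ == "__main__":
    events = EventWaiters()
    # An event arrives for an instance that never registered a waiter (already
    # deleted), mirroring the WARNING lines above:
    events.pop_and_signal("c4fc7798-567a-4002-b056-6c4f02d0e955",
                          "network-vif-plugged-4de27111-4afc-4732-88b3-2485c4f254e8")

The three oslo_concurrency.lockutils lines per event in the log (acquiring, acquired, released around _pop_event) correspond to the critical section this sketch models with a plain threading.Lock.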
Apr 17 22:10:50 user nova-compute[71972]: DEBUG nova.compute.manager [req-478920c6-a1e6-42b6-8c70-30be76369c96 req-e8081bdd-a548-452e-94cb-1f4d8b554e8f service nova] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Received event network-changed-b377f91d-95cf-42f6-8cb9-62aa8d68bcb1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:10:50 user nova-compute[71972]: DEBUG nova.compute.manager [req-478920c6-a1e6-42b6-8c70-30be76369c96 req-e8081bdd-a548-452e-94cb-1f4d8b554e8f service nova] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Refreshing instance network info cache due to event network-changed-b377f91d-95cf-42f6-8cb9-62aa8d68bcb1. {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:10:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-478920c6-a1e6-42b6-8c70-30be76369c96 req-e8081bdd-a548-452e-94cb-1f4d8b554e8f service nova] Acquiring lock "refresh_cache-d7c93039-48e6-40b8-b921-d6eb1ebe78ef" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:10:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-478920c6-a1e6-42b6-8c70-30be76369c96 req-e8081bdd-a548-452e-94cb-1f4d8b554e8f service nova] Acquired lock "refresh_cache-d7c93039-48e6-40b8-b921-d6eb1ebe78ef" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:10:50 user nova-compute[71972]: DEBUG nova.network.neutron [req-478920c6-a1e6-42b6-8c70-30be76369c96 req-e8081bdd-a548-452e-94cb-1f4d8b554e8f service nova] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Refreshing network info cache for port b377f91d-95cf-42f6-8cb9-62aa8d68bcb1 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:10:50 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:50 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:50 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.compute.manager [req-75085479-e2c6-4345-af7e-71abb5ed21bc req-a973cc33-fad7-4fd5-a16f-4941981c4a8c service nova] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Received event network-vif-plugged-b377f91d-95cf-42f6-8cb9-62aa8d68bcb1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-75085479-e2c6-4345-af7e-71abb5ed21bc req-a973cc33-fad7-4fd5-a16f-4941981c4a8c service nova] Acquiring lock "d7c93039-48e6-40b8-b921-d6eb1ebe78ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:51 user 
nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-75085479-e2c6-4345-af7e-71abb5ed21bc req-a973cc33-fad7-4fd5-a16f-4941981c4a8c service nova] Lock "d7c93039-48e6-40b8-b921-d6eb1ebe78ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-75085479-e2c6-4345-af7e-71abb5ed21bc req-a973cc33-fad7-4fd5-a16f-4941981c4a8c service nova] Lock "d7c93039-48e6-40b8-b921-d6eb1ebe78ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.compute.manager [req-75085479-e2c6-4345-af7e-71abb5ed21bc req-a973cc33-fad7-4fd5-a16f-4941981c4a8c service nova] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] No waiting events found dispatching network-vif-plugged-b377f91d-95cf-42f6-8cb9-62aa8d68bcb1 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:10:51 user nova-compute[71972]: WARNING nova.compute.manager [req-75085479-e2c6-4345-af7e-71abb5ed21bc req-a973cc33-fad7-4fd5-a16f-4941981c4a8c service nova] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Received unexpected event network-vif-plugged-b377f91d-95cf-42f6-8cb9-62aa8d68bcb1 for instance with vm_state building and task_state spawning. Apr 17 22:10:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.network.neutron [req-478920c6-a1e6-42b6-8c70-30be76369c96 req-e8081bdd-a548-452e-94cb-1f4d8b554e8f service nova] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Updated VIF entry in instance network info cache for port b377f91d-95cf-42f6-8cb9-62aa8d68bcb1. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.network.neutron [req-478920c6-a1e6-42b6-8c70-30be76369c96 req-e8081bdd-a548-452e-94cb-1f4d8b554e8f service nova] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Updating instance_info_cache with network_info: [{"id": "b377f91d-95cf-42f6-8cb9-62aa8d68bcb1", "address": "fa:16:3e:6a:32:e4", "network": {"id": "3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb377f91d-95", "ovs_interfaceid": "b377f91d-95cf-42f6-8cb9-62aa8d68bcb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-478920c6-a1e6-42b6-8c70-30be76369c96 req-e8081bdd-a548-452e-94cb-1f4d8b554e8f service nova] Releasing lock "refresh_cache-d7c93039-48e6-40b8-b921-d6eb1ebe78ef" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.compute.manager [req-c9d613fa-0580-485c-ab41-68b4cc8d6631 req-b97809f8-76f2-4f44-8257-92e36a67fb9b service nova] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Received event network-vif-deleted-7557dbee-f2e2-47a0-88eb-1377350f8504 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:10:51 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 97067629-e099-49fd-bb79-223dd4401405] Instance destroyed successfully. 
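The refresh above ends with the full network_info entry for port b377f91d-95cf-42f6-8cb9-62aa8d68bcb1 being written back to the instance cache, while the earlier "Converted object VIFOpenVSwitch(...)" lines show only a handful of those fields surviving the conversion handed to os-vif. A minimal sketch of that field mapping, using an entry trimmed to the fields involved; OvsVifSketch and vif_from_network_info are illustrative names, not Nova or os-vif APIs:

# Sketch of which network_info fields reappear in the VIFOpenVSwitch repr and
# in the ovsdbapp DbSetCommand external_ids above. Not Nova's os_vif_util code.

from dataclasses import dataclass


@dataclass
class OvsVifSketch:
    port_id: str    # VIFOpenVSwitch.id / external_ids "iface-id"
    mac: str        # VIFOpenVSwitch.address / external_ids "attached-mac"
    bridge: str     # VIFOpenVSwitch.bridge_name
    dev_name: str   # VIFOpenVSwitch.vif_name (the tap device added to br-int)
    mtu: int


def vif_from_network_info(entry: dict) -> OvsVifSketch:
    """Map one logged network_info element to the os-vif-relevant fields."""
    return OvsVifSketch(
        port_id=entry["id"],
        mac=entry["address"],
        bridge=entry["network"]["bridge"],
        dev_name=entry["devname"],
        mtu=entry["network"]["meta"]["mtu"],
    )


if __name__ == "__main__":
    # Trimmed copy of the entry logged above for instance d7c93039-48e6-40b8-b921-d6eb1ebe78ef.
    logged_entry = {
        "id": "b377f91d-95cf-42f6-8cb9-62aa8d68bcb1",
        "address": "fa:16:3e:6a:32:e4",
        "devname": "tapb377f91d-95",
        "network": {"bridge": "br-int", "meta": {"mtu": 1442}},
    }
    print(vif_from_network_info(logged_entry))

Running it prints the same id, MAC, bridge and devname that appear in the VIFOpenVSwitch repr and in the later AddPortCommand/DbSetCommand transactions on br-int.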
Apr 17 22:10:51 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Attempting rescue Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} {{(pid=71972) rescue /opt/stack/nova/nova/virt/libvirt/driver.py:4289}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Instance directory exists: not creating {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4694}} Apr 17 22:10:51 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Creating image(s) Apr 17 22:10:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquiring lock "/opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "/opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "/opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.objects.instance [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lazy-loading 'trusted_certs' on Instance uuid 97067629-e099-49fd-bb79-223dd4401405 {{(pid=71972) obj_load_attr 
/opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.132s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk.rescue {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk.rescue" returned: 0 in 0.044s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 
tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.182s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.objects.instance [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lazy-loading 'migration_context' on Instance uuid 97067629-e099-49fd-bb79-223dd4401405 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Start _get_guest_xml network_info=[{"id": "63dc9a41-e89e-4673-a658-7acddd88706f", "address": "fa:16:3e:3e:d4:f0", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "vif_mac": "fa:16:3e:3e:d4:f0"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap63dc9a41-e8", "ovs_interfaceid": "63dc9a41-e89e-4673-a658-7acddd88706f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=) rescue={'image_id': '80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.objects.instance [None req-90240a6f-9b61-4517-b4af-9d218638950a 
tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lazy-loading 'resources' on Instance uuid 97067629-e099-49fd-bb79-223dd4401405 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.objects.instance [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lazy-loading 'numa_topology' on Instance uuid 97067629-e099-49fd-bb79-223dd4401405 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:10:51 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:10:51 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG 
nova.virt.hardware [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.objects.instance [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lazy-loading 'vcpu_model' on Instance uuid 97067629-e099-49fd-bb79-223dd4401405 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:08:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1698566818',display_name='tempest-ServerRescueNegativeTestJSON-server-1698566818',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1698566818',id=6,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-17T22:09:14Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='a863c30ce3844f0ba754b048c2833fa3',ramdisk_id='',reservation_id='r-x70r9ud0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-2008986942',owner_user_name='tempest-ServerRescueNegativeTestJSON-2008986942-project-member'},tags=,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:09:15Z,user_data=None,user_id='b3e3003057e7456c933b762412442a3e',uuid=97067629-e099-49fd-bb79-223dd4401405,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "63dc9a41-e89e-4673-a658-7acddd88706f", "address": "fa:16:3e:3e:d4:f0", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "vif_mac": "fa:16:3e:3e:d4:f0"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap63dc9a41-e8", "ovs_interfaceid": "63dc9a41-e89e-4673-a658-7acddd88706f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None 
req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Converting VIF {"id": "63dc9a41-e89e-4673-a658-7acddd88706f", "address": "fa:16:3e:3e:d4:f0", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "vif_mac": "fa:16:3e:3e:d4:f0"}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap63dc9a41-e8", "ovs_interfaceid": "63dc9a41-e89e-4673-a658-7acddd88706f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3e:d4:f0,bridge_name='br-int',has_traffic_filtering=True,id=63dc9a41-e89e-4673-a658-7acddd88706f,network=Network(bd20802f-022c-4a11-8310-49b2375f642e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63dc9a41-e8') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.objects.instance [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lazy-loading 'pci_devices' on Instance uuid 97067629-e099-49fd-bb79-223dd4401405 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] End _get_guest_xml xml= [domain XML omitted: the angle-bracket markup was stripped from this capture, leaving only element text; recoverable values include uuid 97067629-e099-49fd-bb79-223dd4401405, name instance-00000006, memory 131072 KiB, 1 vCPU, project/user tempest-ServerRescueNegativeTestJSON-2008986942 / tempest-ServerRescueNegativeTestJSON-2008986942-project-member, sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, os type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 22:10:51 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 97067629-e099-49fd-bb79-223dd4401405] Instance destroyed successfully.
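
The two processutils commands earlier in this rescue sequence show how the rescue disk is built: qemu-img info (wrapped in oslo_concurrency.prlimit to cap address space and CPU time) inspects the cached base image, then qemu-img create layers a qcow2 overlay (disk.rescue) on top of it. A minimal sketch of the same sequence, using plain subprocess instead of Nova's processutils wrapper and reusing the paths from the log:

    import json
    import subprocess

    BASE = "/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720"
    OVERLAY = "/opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk.rescue"

    def base_image_info(path):
        # Equivalent of the prlimit-wrapped "qemu-img info" call in the log;
        # --force-share avoids taking a write lock on an image that may be in use.
        out = subprocess.check_output(
            ["qemu-img", "info", "--force-share", "--output=json", path],
            env={"LC_ALL": "C", "LANG": "C"})
        return json.loads(out)

    def create_rescue_overlay(base, overlay):
        # qcow2 overlay whose backing file is the raw cached base image, matching the
        # "qemu-img create -f qcow2 -o backing_file=...,backing_fmt=raw" command above.
        subprocess.check_call(
            ["qemu-img", "create", "-f", "qcow2",
             "-o", "backing_file=%s,backing_fmt=raw" % base, overlay],
            env={"LC_ALL": "C", "LANG": "C"})

    if __name__ == "__main__":
        print(base_image_info(BASE)["virtual-size"])
        create_rescue_overlay(BASE, OVERLAY)

Because the overlay only records writes, creating it takes milliseconds (0.044s in the log) regardless of the base image size.
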
Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] No BDM found with device name vda, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] No BDM found with device name vdb, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:10:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] No VIF found with MAC fa:16:3e:3e:d4:f0, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:10:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.compute.manager [req-db7e5d56-3857-4118-9910-8b36ffcd8054 req-a5d219ae-3f17-47fa-bca1-27fbec5a0bc0 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] Received event network-vif-unplugged-63dc9a41-e89e-4673-a658-7acddd88706f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-db7e5d56-3857-4118-9910-8b36ffcd8054 req-a5d219ae-3f17-47fa-bca1-27fbec5a0bc0 service nova] Acquiring lock "97067629-e099-49fd-bb79-223dd4401405-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-db7e5d56-3857-4118-9910-8b36ffcd8054 req-a5d219ae-3f17-47fa-bca1-27fbec5a0bc0 service nova] Lock "97067629-e099-49fd-bb79-223dd4401405-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-db7e5d56-3857-4118-9910-8b36ffcd8054 req-a5d219ae-3f17-47fa-bca1-27fbec5a0bc0 service nova] Lock "97067629-e099-49fd-bb79-223dd4401405-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.compute.manager [req-db7e5d56-3857-4118-9910-8b36ffcd8054 req-a5d219ae-3f17-47fa-bca1-27fbec5a0bc0 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] No waiting events found dispatching network-vif-unplugged-63dc9a41-e89e-4673-a658-7acddd88706f {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:10:53 user nova-compute[71972]: WARNING nova.compute.manager [req-db7e5d56-3857-4118-9910-8b36ffcd8054 req-a5d219ae-3f17-47fa-bca1-27fbec5a0bc0 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] Received 
unexpected event network-vif-unplugged-63dc9a41-e89e-4673-a658-7acddd88706f for instance with vm_state active and task_state rescuing. Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.compute.manager [req-db7e5d56-3857-4118-9910-8b36ffcd8054 req-a5d219ae-3f17-47fa-bca1-27fbec5a0bc0 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] Received event network-vif-plugged-63dc9a41-e89e-4673-a658-7acddd88706f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-db7e5d56-3857-4118-9910-8b36ffcd8054 req-a5d219ae-3f17-47fa-bca1-27fbec5a0bc0 service nova] Acquiring lock "97067629-e099-49fd-bb79-223dd4401405-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-db7e5d56-3857-4118-9910-8b36ffcd8054 req-a5d219ae-3f17-47fa-bca1-27fbec5a0bc0 service nova] Lock "97067629-e099-49fd-bb79-223dd4401405-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-db7e5d56-3857-4118-9910-8b36ffcd8054 req-a5d219ae-3f17-47fa-bca1-27fbec5a0bc0 service nova] Lock "97067629-e099-49fd-bb79-223dd4401405-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.compute.manager [req-db7e5d56-3857-4118-9910-8b36ffcd8054 req-a5d219ae-3f17-47fa-bca1-27fbec5a0bc0 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] No waiting events found dispatching network-vif-plugged-63dc9a41-e89e-4673-a658-7acddd88706f {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:10:53 user nova-compute[71972]: WARNING nova.compute.manager [req-db7e5d56-3857-4118-9910-8b36ffcd8054 req-a5d219ae-3f17-47fa-bca1-27fbec5a0bc0 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] Received unexpected event network-vif-plugged-63dc9a41-e89e-4673-a658-7acddd88706f for instance with vm_state active and task_state rescuing. Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.compute.manager [req-db7e5d56-3857-4118-9910-8b36ffcd8054 req-a5d219ae-3f17-47fa-bca1-27fbec5a0bc0 service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Received event network-changed-11c4742a-b778-458f-9a76-1a8d6330f415 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.compute.manager [req-db7e5d56-3857-4118-9910-8b36ffcd8054 req-a5d219ae-3f17-47fa-bca1-27fbec5a0bc0 service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Refreshing instance network info cache due to event network-changed-11c4742a-b778-458f-9a76-1a8d6330f415. 
{{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-db7e5d56-3857-4118-9910-8b36ffcd8054 req-a5d219ae-3f17-47fa-bca1-27fbec5a0bc0 service nova] Acquiring lock "refresh_cache-cbaa3995-f00d-4194-b7e2-29bfc6e27614" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-db7e5d56-3857-4118-9910-8b36ffcd8054 req-a5d219ae-3f17-47fa-bca1-27fbec5a0bc0 service nova] Acquired lock "refresh_cache-cbaa3995-f00d-4194-b7e2-29bfc6e27614" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.network.neutron [req-db7e5d56-3857-4118-9910-8b36ffcd8054 req-a5d219ae-3f17-47fa-bca1-27fbec5a0bc0 service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Refreshing network info cache for port 11c4742a-b778-458f-9a76-1a8d6330f415 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.compute.manager [req-ca943131-684c-496c-8e45-a4038d7e6278 req-8315ae59-85f1-42e5-bd86-7e025844023d service nova] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Received event network-vif-plugged-b377f91d-95cf-42f6-8cb9-62aa8d68bcb1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-ca943131-684c-496c-8e45-a4038d7e6278 req-8315ae59-85f1-42e5-bd86-7e025844023d service nova] Acquiring lock "d7c93039-48e6-40b8-b921-d6eb1ebe78ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-ca943131-684c-496c-8e45-a4038d7e6278 req-8315ae59-85f1-42e5-bd86-7e025844023d service nova] Lock "d7c93039-48e6-40b8-b921-d6eb1ebe78ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-ca943131-684c-496c-8e45-a4038d7e6278 req-8315ae59-85f1-42e5-bd86-7e025844023d service nova] Lock "d7c93039-48e6-40b8-b921-d6eb1ebe78ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.compute.manager [req-ca943131-684c-496c-8e45-a4038d7e6278 req-8315ae59-85f1-42e5-bd86-7e025844023d service nova] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] No waiting events found dispatching network-vif-plugged-b377f91d-95cf-42f6-8cb9-62aa8d68bcb1 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:10:53 user nova-compute[71972]: WARNING nova.compute.manager [req-ca943131-684c-496c-8e45-a4038d7e6278 req-8315ae59-85f1-42e5-bd86-7e025844023d service nova] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Received unexpected event network-vif-plugged-b377f91d-95cf-42f6-8cb9-62aa8d68bcb1 for instance with vm_state building and task_state spawning. 
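
The pop_instance_event / lock dance repeated above is the compute manager matching Neutron's external events against waiters registered per instance; when nothing is waiting (as during this rescue), the event is logged as unexpected and dropped. A simplified sketch of that registry, using plain threading primitives rather than Nova's actual InstanceEvents class (the class and method names here are illustrative):

    import threading

    class InstanceEvents:
        """Simplified sketch: map instance uuid -> {event name: waiter}."""

        def __init__(self):
            self._lock = threading.Lock()   # stands in for the "<uuid>-events" lock in the log
            self._events = {}               # {instance_uuid: {event_name: threading.Event}}

        def prepare(self, instance_uuid, event_name):
            # A caller expecting e.g. "network-vif-plugged-<port>" registers a waiter first.
            waiter = threading.Event()
            with self._lock:
                self._events.setdefault(instance_uuid, {})[event_name] = waiter
            return waiter

        def pop(self, instance_uuid, event_name):
            # Called when Neutron delivers the external event; returns None if nobody is
            # waiting, which is the "No waiting events found dispatching ..." case above.
            with self._lock:
                return self._events.get(instance_uuid, {}).pop(event_name, None)

    events = InstanceEvents()
    waiter = events.pop("97067629-e099-49fd-bb79-223dd4401405",
                        "network-vif-plugged-63dc9a41-e89e-4673-a658-7acddd88706f")
    if waiter is None:
        print("no waiting events found; treating event as unexpected")
    else:
        waiter.set()   # wake whichever thread is blocked on waiter.wait()
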
Apr 17 22:10:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:10:53 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] VM Resumed (Lifecycle Event) Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Instance event wait completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:10:53 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Instance spawned successfully. 
Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 
d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Found default for hw_vif_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:10:53 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:10:53 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] VM Started (Lifecycle Event) Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:10:53 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:10:53 user nova-compute[71972]: INFO nova.compute.manager [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Took 7.03 seconds to spawn the instance on the hypervisor. Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:53 user nova-compute[71972]: INFO nova.compute.manager [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Took 7.91 seconds to build instance. 
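
After a successful spawn the driver records defaults for image properties the image did not set (hw_cdrom_bus=ide, hw_disk_bus=virtio, and so on) so later operations keep using the same buses. A hedged sketch of that idea, persisting the chosen defaults into the instance's system_metadata under the image_* keys visible in the Instance dumps elsewhere in this log (the helper name and hard-coded defaults are illustrative, not Nova's exact code):

    # Illustrative defaults, mirroring the "Found default for ..." lines above.
    LIBVIRT_DEFAULTS = {
        "hw_cdrom_bus": "ide",
        "hw_disk_bus": "virtio",
        "hw_input_bus": None,
        "hw_pointer_model": None,
        "hw_video_model": "virtio",
        "hw_vif_model": "virtio",
    }

    def register_undefined_instance_details(image_props, system_metadata):
        """For every property the image left unset, record the default the driver used.

        system_metadata keys take the image_ prefix, e.g. image_hw_disk_bus='virtio',
        matching the system_metadata shown in the Instance dumps in this log.
        """
        for prop, default in LIBVIRT_DEFAULTS.items():
            if prop not in image_props:
                system_metadata["image_" + prop] = default
        return system_metadata

    sysmeta = register_undefined_instance_details(
        image_props={"hw_rng_model": "virtio"}, system_metadata={})
    print(sysmeta["image_hw_disk_bus"])   # -> virtio
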
Apr 17 22:10:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5cdc29ec-46ee-4466-a4ce-131c698c918e tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "d7c93039-48e6-40b8-b921-d6eb1ebe78ef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 8.008s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.network.neutron [req-db7e5d56-3857-4118-9910-8b36ffcd8054 req-a5d219ae-3f17-47fa-bca1-27fbec5a0bc0 service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Updated VIF entry in instance network info cache for port 11c4742a-b778-458f-9a76-1a8d6330f415. {{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG nova.network.neutron [req-db7e5d56-3857-4118-9910-8b36ffcd8054 req-a5d219ae-3f17-47fa-bca1-27fbec5a0bc0 service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Updating instance_info_cache with network_info: [{"id": "11c4742a-b778-458f-9a76-1a8d6330f415", "address": "fa:16:3e:43:2f:41", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.37", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c4742a-b7", "ovs_interfaceid": "11c4742a-b778-458f-9a76-1a8d6330f415", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-db7e5d56-3857-4118-9910-8b36ffcd8054 req-a5d219ae-3f17-47fa-bca1-27fbec5a0bc0 service nova] Releasing lock "refresh_cache-cbaa3995-f00d-4194-b7e2-29bfc6e27614" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Acquiring lock "cbaa3995-f00d-4194-b7e2-29bfc6e27614" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "cbaa3995-f00d-4194-b7e2-29bfc6e27614" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71972) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Acquiring lock "cbaa3995-f00d-4194-b7e2-29bfc6e27614-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "cbaa3995-f00d-4194-b7e2-29bfc6e27614-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "cbaa3995-f00d-4194-b7e2-29bfc6e27614-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:53 user nova-compute[71972]: INFO nova.compute.manager [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Terminating instance Apr 17 22:10:54 user nova-compute[71972]: DEBUG nova.compute.manager [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Start destroying the instance on the hypervisor. {{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:10:54 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:54 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:54 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:54 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:54 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:54 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Instance destroyed successfully. 
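
The terminate path above serializes on a lock named after the instance uuid, then clears any pending external events before destroying the guest. The same serialization pattern can be sketched with oslo.concurrency's lockutils, which is the library these "Acquiring lock ... / Lock ... acquired / released" messages come from (the function body below is a placeholder, not Nova's code):

    from oslo_concurrency import lockutils

    def terminate_instance(instance_uuid):
        # Equivalent to the "Lock '<uuid>' acquired by ... do_terminate_instance" lines:
        # only one terminate (or build) can run for a given instance at a time.
        @lockutils.synchronized(instance_uuid)
        def do_terminate_instance():
            # placeholder for: clear pending events, shut down the guest,
            # unplug VIFs, delete instance files, deallocate networking
            print("terminating %s" % instance_uuid)

        do_terminate_instance()

    terminate_instance("cbaa3995-f00d-4194-b7e2-29bfc6e27614")
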
Apr 17 22:10:54 user nova-compute[71972]: DEBUG nova.objects.instance [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lazy-loading 'resources' on Instance uuid cbaa3995-f00d-4194-b7e2-29bfc6e27614 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:10:54 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:09:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1596520601',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-1596520601',id=7,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIUDw4oqZGwgHOlogieqsyzip0wW30jA5743bGk+uV3e5+U/R9yL7AS0GXvphtOuOOI4CmAsHcasbFw371H+sA0tPQYeuyAIlEJTYvj1WiabqMovvf1nFdGYtWBjTZP1Rw==',key_name='tempest-keypair-1290889891',keypairs=,launch_index=0,launched_at=2023-04-17T22:09:21Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='9283fe3c9a094f9bbddb08e48973da44',ramdisk_id='',reservation_id='r-pap560xw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-1051644628',owner_user_name='tempest-AttachVolumeShelveTestJSON-1051644628-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:09:22Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a5ec05fe7d7244368c7eec3739a96c19',uuid=cbaa3995-f00d-4194-b7e2-29bfc6e27614,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "11c4742a-b778-458f-9a76-1a8d6330f415", "address": "fa:16:3e:43:2f:41", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.37", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c4742a-b7", "ovs_interfaceid": "11c4742a-b778-458f-9a76-1a8d6330f415", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:10:54 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Converting VIF {"id": "11c4742a-b778-458f-9a76-1a8d6330f415", "address": "fa:16:3e:43:2f:41", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.37", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap11c4742a-b7", "ovs_interfaceid": "11c4742a-b778-458f-9a76-1a8d6330f415", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:10:54 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:43:2f:41,bridge_name='br-int',has_traffic_filtering=True,id=11c4742a-b778-458f-9a76-1a8d6330f415,network=Network(2aae2552-3ef8-41d3-84e6-313da6fc203b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c4742a-b7') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:10:54 user nova-compute[71972]: DEBUG os_vif [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:43:2f:41,bridge_name='br-int',has_traffic_filtering=True,id=11c4742a-b778-458f-9a76-1a8d6330f415,network=Network(2aae2552-3ef8-41d3-84e6-313da6fc203b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c4742a-b7') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:10:54 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:54 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap11c4742a-b7, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:10:54 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:54 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:54 user nova-compute[71972]: INFO os_vif [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:43:2f:41,bridge_name='br-int',has_traffic_filtering=True,id=11c4742a-b778-458f-9a76-1a8d6330f415,network=Network(2aae2552-3ef8-41d3-84e6-313da6fc203b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap11c4742a-b7') Apr 17 22:10:54 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Deleting instance files /opt/stack/data/nova/instances/cbaa3995-f00d-4194-b7e2-29bfc6e27614_del Apr 17 22:10:54 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Deletion of /opt/stack/data/nova/instances/cbaa3995-f00d-4194-b7e2-29bfc6e27614_del complete Apr 17 22:10:54 user nova-compute[71972]: INFO nova.compute.manager [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 17 22:10:54 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:10:54 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:10:54 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.compute.manager [req-bbd8820a-8d83-45c3-b9f3-2471a0034d9e req-65a78248-1f45-492e-b263-3f8f98c2c447 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] Received event network-vif-plugged-63dc9a41-e89e-4673-a658-7acddd88706f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-bbd8820a-8d83-45c3-b9f3-2471a0034d9e req-65a78248-1f45-492e-b263-3f8f98c2c447 service nova] Acquiring lock "97067629-e099-49fd-bb79-223dd4401405-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-bbd8820a-8d83-45c3-b9f3-2471a0034d9e req-65a78248-1f45-492e-b263-3f8f98c2c447 service nova] Lock "97067629-e099-49fd-bb79-223dd4401405-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-bbd8820a-8d83-45c3-b9f3-2471a0034d9e req-65a78248-1f45-492e-b263-3f8f98c2c447 service nova] Lock "97067629-e099-49fd-bb79-223dd4401405-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.compute.manager [req-bbd8820a-8d83-45c3-b9f3-2471a0034d9e req-65a78248-1f45-492e-b263-3f8f98c2c447 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] No waiting events found dispatching network-vif-plugged-63dc9a41-e89e-4673-a658-7acddd88706f {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:10:55 user nova-compute[71972]: WARNING nova.compute.manager [req-bbd8820a-8d83-45c3-b9f3-2471a0034d9e req-65a78248-1f45-492e-b263-3f8f98c2c447 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] Received unexpected event network-vif-plugged-63dc9a41-e89e-4673-a658-7acddd88706f for instance with vm_state active and task_state rescuing. 
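
Unplugging the VIF a few entries back comes down to removing the tap port from the integration bridge; the ovsdbapp transaction shows DelPortCommand(port=tap11c4742a-b7, bridge=br-int, if_exists=True). The same effect expressed with the ovs-vsctl CLI instead of the ovsdbapp IDL, as a rough equivalent for illustration rather than what os-vif executes internally:

    import subprocess

    def delete_ovs_port(bridge, port):
        # --if-exists mirrors the if_exists=True flag on DelPortCommand:
        # deleting an already-removed port is not treated as an error.
        subprocess.check_call(
            ["ovs-vsctl", "--if-exists", "del-port", bridge, port])

    delete_ovs_port("br-int", "tap11c4742a-b7")
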
Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.compute.manager [req-bbd8820a-8d83-45c3-b9f3-2471a0034d9e req-65a78248-1f45-492e-b263-3f8f98c2c447 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] Received event network-vif-plugged-63dc9a41-e89e-4673-a658-7acddd88706f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-bbd8820a-8d83-45c3-b9f3-2471a0034d9e req-65a78248-1f45-492e-b263-3f8f98c2c447 service nova] Acquiring lock "97067629-e099-49fd-bb79-223dd4401405-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-bbd8820a-8d83-45c3-b9f3-2471a0034d9e req-65a78248-1f45-492e-b263-3f8f98c2c447 service nova] Lock "97067629-e099-49fd-bb79-223dd4401405-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-bbd8820a-8d83-45c3-b9f3-2471a0034d9e req-65a78248-1f45-492e-b263-3f8f98c2c447 service nova] Lock "97067629-e099-49fd-bb79-223dd4401405-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.compute.manager [req-bbd8820a-8d83-45c3-b9f3-2471a0034d9e req-65a78248-1f45-492e-b263-3f8f98c2c447 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] No waiting events found dispatching network-vif-plugged-63dc9a41-e89e-4673-a658-7acddd88706f {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:10:55 user nova-compute[71972]: WARNING nova.compute.manager [req-bbd8820a-8d83-45c3-b9f3-2471a0034d9e req-65a78248-1f45-492e-b263-3f8f98c2c447 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] Received unexpected event network-vif-plugged-63dc9a41-e89e-4673-a658-7acddd88706f for instance with vm_state active and task_state rescuing. 
Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.compute.manager [req-e81bb7db-323c-4325-b301-dede40aa29d8 req-78b759d2-4b61-47b5-8937-05571b04c6da service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Received event network-vif-unplugged-11c4742a-b778-458f-9a76-1a8d6330f415 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-e81bb7db-323c-4325-b301-dede40aa29d8 req-78b759d2-4b61-47b5-8937-05571b04c6da service nova] Acquiring lock "cbaa3995-f00d-4194-b7e2-29bfc6e27614-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-e81bb7db-323c-4325-b301-dede40aa29d8 req-78b759d2-4b61-47b5-8937-05571b04c6da service nova] Lock "cbaa3995-f00d-4194-b7e2-29bfc6e27614-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-e81bb7db-323c-4325-b301-dede40aa29d8 req-78b759d2-4b61-47b5-8937-05571b04c6da service nova] Lock "cbaa3995-f00d-4194-b7e2-29bfc6e27614-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.compute.manager [req-e81bb7db-323c-4325-b301-dede40aa29d8 req-78b759d2-4b61-47b5-8937-05571b04c6da service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] No waiting events found dispatching network-vif-unplugged-11c4742a-b778-458f-9a76-1a8d6330f415 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.compute.manager [req-e81bb7db-323c-4325-b301-dede40aa29d8 req-78b759d2-4b61-47b5-8937-05571b04c6da service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Received event network-vif-unplugged-11c4742a-b778-458f-9a76-1a8d6330f415 for instance with task_state deleting. 
{{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.compute.manager [req-e81bb7db-323c-4325-b301-dede40aa29d8 req-78b759d2-4b61-47b5-8937-05571b04c6da service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Received event network-vif-plugged-11c4742a-b778-458f-9a76-1a8d6330f415 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-e81bb7db-323c-4325-b301-dede40aa29d8 req-78b759d2-4b61-47b5-8937-05571b04c6da service nova] Acquiring lock "cbaa3995-f00d-4194-b7e2-29bfc6e27614-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-e81bb7db-323c-4325-b301-dede40aa29d8 req-78b759d2-4b61-47b5-8937-05571b04c6da service nova] Lock "cbaa3995-f00d-4194-b7e2-29bfc6e27614-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-e81bb7db-323c-4325-b301-dede40aa29d8 req-78b759d2-4b61-47b5-8937-05571b04c6da service nova] Lock "cbaa3995-f00d-4194-b7e2-29bfc6e27614-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.compute.manager [req-e81bb7db-323c-4325-b301-dede40aa29d8 req-78b759d2-4b61-47b5-8937-05571b04c6da service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] No waiting events found dispatching network-vif-plugged-11c4742a-b778-458f-9a76-1a8d6330f415 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:10:55 user nova-compute[71972]: WARNING nova.compute.manager [req-e81bb7db-323c-4325-b301-dede40aa29d8 req-78b759d2-4b61-47b5-8937-05571b04c6da service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Received unexpected event network-vif-plugged-11c4742a-b778-458f-9a76-1a8d6330f415 for instance with vm_state active and task_state deleting. Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:10:55 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Took 0.77 seconds to deallocate network for instance. 
Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.compute.manager [req-0c5e842b-e920-47df-82db-4d35a9344207 req-967e51d9-81da-4fe8-bf41-035f7010514e service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Received event network-vif-deleted-11c4742a-b778-458f-9a76-1a8d6330f415 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:10:55 user nova-compute[71972]: INFO nova.compute.manager [req-0c5e842b-e920-47df-82db-4d35a9344207 req-967e51d9-81da-4fe8-bf41-035f7010514e service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Neutron deleted interface 11c4742a-b778-458f-9a76-1a8d6330f415; detaching it from the instance and deleting it from the info cache Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.network.neutron [req-0c5e842b-e920-47df-82db-4d35a9344207 req-967e51d9-81da-4fe8-bf41-035f7010514e service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.compute.manager [req-0c5e842b-e920-47df-82db-4d35a9344207 req-967e51d9-81da-4fe8-bf41-035f7010514e service nova] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Detach interface failed, port_id=11c4742a-b778-458f-9a76-1a8d6330f415, reason: Instance cbaa3995-f00d-4194-b7e2-29bfc6e27614 could not be found. {{(pid=71972) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.virt.libvirt.host [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Removed pending event for 97067629-e099-49fd-bb79-223dd4401405 due to event {{(pid=71972) _event_emit_delayed /opt/stack/nova/nova/virt/libvirt/host.py:438}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:10:55 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 97067629-e099-49fd-bb79-223dd4401405] VM Resumed (Lifecycle Event) Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-90240a6f-9b61-4517-b4af-9d218638950a tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.compute.manager [None 
req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 97067629-e099-49fd-bb79-223dd4401405] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 97067629-e099-49fd-bb79-223dd4401405] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:10:55 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 97067629-e099-49fd-bb79-223dd4401405] During sync_power_state the instance has a pending task (rescuing). Skip. Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:10:55 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 97067629-e099-49fd-bb79-223dd4401405] VM Started (Lifecycle Event) Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 97067629-e099-49fd-bb79-223dd4401405] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 97067629-e099-49fd-bb79-223dd4401405] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:10:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.363s {{(pid=71972) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:55 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Deleted allocations for instance cbaa3995-f00d-4194-b7e2-29bfc6e27614 Apr 17 22:10:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7baa820d-07d5-4349-9105-c176b30ad0fd tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "cbaa3995-f00d-4194-b7e2-29bfc6e27614" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.969s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:10:57 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:10:59 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:02 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:11:02 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] VM Stopped (Lifecycle Event) Apr 17 22:11:02 user nova-compute[71972]: DEBUG nova.compute.manager [None req-13d0082f-37e6-43d3-997a-1aae28bcbfa5 None None] [instance: c4fc7798-567a-4002-b056-6c4f02d0e955] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:11:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:03 user nova-compute[71972]: DEBUG nova.compute.manager [req-185733f7-ede3-4bbe-84e7-ce0f45035933 req-bd481863-61a5-4a29-a00a-a79da61274a5 service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Received event network-changed-ece1440f-e4b0-4eef-9ab4-6a741473bf2f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:11:03 user nova-compute[71972]: DEBUG nova.compute.manager [req-185733f7-ede3-4bbe-84e7-ce0f45035933 req-bd481863-61a5-4a29-a00a-a79da61274a5 service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Refreshing instance network info cache due to event network-changed-ece1440f-e4b0-4eef-9ab4-6a741473bf2f. 
{{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:11:03 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-185733f7-ede3-4bbe-84e7-ce0f45035933 req-bd481863-61a5-4a29-a00a-a79da61274a5 service nova] Acquiring lock "refresh_cache-b2751b9c-c966-416d-aaaa-81756198849c" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:11:03 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-185733f7-ede3-4bbe-84e7-ce0f45035933 req-bd481863-61a5-4a29-a00a-a79da61274a5 service nova] Acquired lock "refresh_cache-b2751b9c-c966-416d-aaaa-81756198849c" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:11:03 user nova-compute[71972]: DEBUG nova.network.neutron [req-185733f7-ede3-4bbe-84e7-ce0f45035933 req-bd481863-61a5-4a29-a00a-a79da61274a5 service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Refreshing network info cache for port ece1440f-e4b0-4eef-9ab4-6a741473bf2f {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:11:03 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:11:03 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] VM Stopped (Lifecycle Event) Apr 17 22:11:03 user nova-compute[71972]: DEBUG nova.compute.manager [None req-90f277e8-94fb-4eb1-8f34-cb897416a58b None None] [instance: 3681b009-1a99-4eb8-b189-3fe0647f5d1d] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:11:03 user nova-compute[71972]: DEBUG nova.network.neutron [req-185733f7-ede3-4bbe-84e7-ce0f45035933 req-bd481863-61a5-4a29-a00a-a79da61274a5 service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Updated VIF entry in instance network info cache for port ece1440f-e4b0-4eef-9ab4-6a741473bf2f. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:11:03 user nova-compute[71972]: DEBUG nova.network.neutron [req-185733f7-ede3-4bbe-84e7-ce0f45035933 req-bd481863-61a5-4a29-a00a-a79da61274a5 service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Updating instance_info_cache with network_info: [{"id": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "address": "fa:16:3e:96:4d:19", "network": {"id": "d5f5ecb0-5758-4318-bb86-c30fc214049a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-43150130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fd70480cd5364a3185fe097f88c290ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapece1440f-e4", "ovs_interfaceid": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:11:03 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-185733f7-ede3-4bbe-84e7-ce0f45035933 req-bd481863-61a5-4a29-a00a-a79da61274a5 service nova] Releasing lock "refresh_cache-b2751b9c-c966-416d-aaaa-81756198849c" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:11:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:04 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Acquiring lock "b2751b9c-c966-416d-aaaa-81756198849c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:04 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Lock "b2751b9c-c966-416d-aaaa-81756198849c" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:04 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Acquiring lock "b2751b9c-c966-416d-aaaa-81756198849c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:04 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da 
tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Lock "b2751b9c-c966-416d-aaaa-81756198849c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:04 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Lock "b2751b9c-c966-416d-aaaa-81756198849c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:04 user nova-compute[71972]: INFO nova.compute.manager [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Terminating instance Apr 17 22:11:04 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Start destroying the instance on the hypervisor. {{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:11:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:05 user nova-compute[71972]: DEBUG nova.compute.manager [req-fcb9b23e-4601-41d2-9207-1e8148bf9b0c req-9666c1d0-ceef-4c50-bbc9-0d9b9dee975b service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Received event network-vif-unplugged-ece1440f-e4b0-4eef-9ab4-6a741473bf2f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:11:05 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-fcb9b23e-4601-41d2-9207-1e8148bf9b0c req-9666c1d0-ceef-4c50-bbc9-0d9b9dee975b service nova] Acquiring lock "b2751b9c-c966-416d-aaaa-81756198849c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:05 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-fcb9b23e-4601-41d2-9207-1e8148bf9b0c req-9666c1d0-ceef-4c50-bbc9-0d9b9dee975b service nova] Lock "b2751b9c-c966-416d-aaaa-81756198849c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s 
{{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:05 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-fcb9b23e-4601-41d2-9207-1e8148bf9b0c req-9666c1d0-ceef-4c50-bbc9-0d9b9dee975b service nova] Lock "b2751b9c-c966-416d-aaaa-81756198849c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:05 user nova-compute[71972]: DEBUG nova.compute.manager [req-fcb9b23e-4601-41d2-9207-1e8148bf9b0c req-9666c1d0-ceef-4c50-bbc9-0d9b9dee975b service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] No waiting events found dispatching network-vif-unplugged-ece1440f-e4b0-4eef-9ab4-6a741473bf2f {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:11:05 user nova-compute[71972]: DEBUG nova.compute.manager [req-fcb9b23e-4601-41d2-9207-1e8148bf9b0c req-9666c1d0-ceef-4c50-bbc9-0d9b9dee975b service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Received event network-vif-unplugged-ece1440f-e4b0-4eef-9ab4-6a741473bf2f for instance with task_state deleting. {{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:11:05 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Instance destroyed successfully. Apr 17 22:11:05 user nova-compute[71972]: DEBUG nova.objects.instance [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Lazy-loading 'resources' on Instance uuid b2751b9c-c966-416d-aaaa-81756198849c {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:11:05 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2023-04-17T22:09:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-518621003',display_name='tempest-AttachSCSIVolumeTestJSON-server-518621003',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachscsivolumetestjson-server-518621003',id=8,image_ref='caf3ea13-92a9-40f8-bd4a-51f6b5c53327',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAXWDpAmHJN6DEa8aBiSGZSIAf3NwsuJLBqtY30pGhgpUFLMNyDpGDxGPXPI9IR47QvVMEhsqH5smKDhL4yjmsCvpuEmnNtj1uTFOgYDJS1VE0IAdB4c78I6NkPJBVnexQ==',key_name='tempest-keypair-1545084897',keypairs=,launch_index=0,launched_at=2023-04-17T22:09:29Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='fd70480cd5364a3185fe097f88c290ae',ramdisk_id='',reservation_id='r-x3b00iug',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='caf3ea13-92a9-40f8-bd4a-51f6b5c53327',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_scsi_model='virtio-scsi',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachSCSIVolumeTestJSON-885264932',owner_user_name='tempest-AttachSCSIVolumeTestJSON-885264932-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:09:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='79a02ae084a541b1b7a3fda0190b9ae4',uuid=b2751b9c-c966-416d-aaaa-81756198849c,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "address": "fa:16:3e:96:4d:19", "network": {"id": "d5f5ecb0-5758-4318-bb86-c30fc214049a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-43150130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fd70480cd5364a3185fe097f88c290ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapece1440f-e4", "ovs_interfaceid": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:11:05 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Converting VIF {"id": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "address": "fa:16:3e:96:4d:19", "network": {"id": "d5f5ecb0-5758-4318-bb86-c30fc214049a", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-43150130-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], 
"version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "fd70480cd5364a3185fe097f88c290ae", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapece1440f-e4", "ovs_interfaceid": "ece1440f-e4b0-4eef-9ab4-6a741473bf2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:11:05 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:96:4d:19,bridge_name='br-int',has_traffic_filtering=True,id=ece1440f-e4b0-4eef-9ab4-6a741473bf2f,network=Network(d5f5ecb0-5758-4318-bb86-c30fc214049a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapece1440f-e4') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:11:05 user nova-compute[71972]: DEBUG os_vif [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:96:4d:19,bridge_name='br-int',has_traffic_filtering=True,id=ece1440f-e4b0-4eef-9ab4-6a741473bf2f,network=Network(d5f5ecb0-5758-4318-bb86-c30fc214049a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapece1440f-e4') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:11:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapece1440f-e4, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:11:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:11:05 user nova-compute[71972]: INFO os_vif [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:96:4d:19,bridge_name='br-int',has_traffic_filtering=True,id=ece1440f-e4b0-4eef-9ab4-6a741473bf2f,network=Network(d5f5ecb0-5758-4318-bb86-c30fc214049a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapece1440f-e4') Apr 17 22:11:05 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Deleting instance files 
/opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c_del Apr 17 22:11:05 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Deletion of /opt/stack/data/nova/instances/b2751b9c-c966-416d-aaaa-81756198849c_del complete Apr 17 22:11:05 user nova-compute[71972]: INFO nova.compute.manager [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Took 0.65 seconds to destroy the instance on the hypervisor. Apr 17 22:11:05 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:11:05 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:11:05 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: b2751b9c-c966-416d-aaaa-81756198849c] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:11:06 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:11:06 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Took 0.84 seconds to deallocate network for instance. 
Apr 17 22:11:06 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:06 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:06 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:11:06 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:11:06 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.293s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:06 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Deleted allocations for instance b2751b9c-c966-416d-aaaa-81756198849c Apr 17 22:11:06 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b56ec2a6-fbff-45df-83f6-acae8aa6e6da tempest-AttachSCSIVolumeTestJSON-885264932 tempest-AttachSCSIVolumeTestJSON-885264932-project-member] Lock "b2751b9c-c966-416d-aaaa-81756198849c" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.965s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:07 user nova-compute[71972]: DEBUG nova.compute.manager [req-e202708b-22d9-430e-a89d-b38209697942 req-20877285-de0a-492c-b6f5-b694c0a51366 service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Received event network-vif-plugged-ece1440f-e4b0-4eef-9ab4-6a741473bf2f {{(pid=71972) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:11:07 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-e202708b-22d9-430e-a89d-b38209697942 req-20877285-de0a-492c-b6f5-b694c0a51366 service nova] Acquiring lock "b2751b9c-c966-416d-aaaa-81756198849c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:07 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-e202708b-22d9-430e-a89d-b38209697942 req-20877285-de0a-492c-b6f5-b694c0a51366 service nova] Lock "b2751b9c-c966-416d-aaaa-81756198849c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:07 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-e202708b-22d9-430e-a89d-b38209697942 req-20877285-de0a-492c-b6f5-b694c0a51366 service nova] Lock "b2751b9c-c966-416d-aaaa-81756198849c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:07 user nova-compute[71972]: DEBUG nova.compute.manager [req-e202708b-22d9-430e-a89d-b38209697942 req-20877285-de0a-492c-b6f5-b694c0a51366 service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] No waiting events found dispatching network-vif-plugged-ece1440f-e4b0-4eef-9ab4-6a741473bf2f {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:11:07 user nova-compute[71972]: WARNING nova.compute.manager [req-e202708b-22d9-430e-a89d-b38209697942 req-20877285-de0a-492c-b6f5-b694c0a51366 service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Received unexpected event network-vif-plugged-ece1440f-e4b0-4eef-9ab4-6a741473bf2f for instance with vm_state deleted and task_state None. 
Apr 17 22:11:07 user nova-compute[71972]: DEBUG nova.compute.manager [req-e202708b-22d9-430e-a89d-b38209697942 req-20877285-de0a-492c-b6f5-b694c0a51366 service nova] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Received event network-vif-deleted-ece1440f-e4b0-4eef-9ab4-6a741473bf2f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:11:07 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:09 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:11:09 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] VM Stopped (Lifecycle Event) Apr 17 22:11:09 user nova-compute[71972]: DEBUG nova.compute.manager [None req-455558d7-d92c-4e3f-b57f-c75ab8e8deb9 None None] [instance: cbaa3995-f00d-4194-b7e2-29bfc6e27614] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:11:10 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:12 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:15 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:17 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:20 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:11:20 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: b2751b9c-c966-416d-aaaa-81756198849c] VM Stopped (Lifecycle Event) Apr 17 22:11:20 user nova-compute[71972]: DEBUG nova.compute.manager [None req-12b95077-25e8-4f63-861e-d52bb5138edd None None] [instance: b2751b9c-c966-416d-aaaa-81756198849c] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:11:20 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:25 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:27 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:30 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:35 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:35 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:11:35 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:11:36 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:11:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:36 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:11:36 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:36 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:36 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None 
req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk.rescue --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk.rescue --force-share --output=json" returned: 0 in 0.157s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk.rescue --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk.rescue --force-share --output=json" returned: 0 in 0.152s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0c76babb-25fa-4e8b-9e09-c705153a95e5/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0c76babb-25fa-4e8b-9e09-c705153a95e5/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0c76babb-25fa-4e8b-9e09-c705153a95e5/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/0c76babb-25fa-4e8b-9e09-c705153a95e5/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/04e443ff-f9f1-4424-847d-ea7557307ec6/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/04e443ff-f9f1-4424-847d-ea7557307ec6/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/04e443ff-f9f1-4424-847d-ea7557307ec6/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/04e443ff-f9f1-4424-847d-ea7557307ec6/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:38 user 
nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Acquiring lock "0c76babb-25fa-4e8b-9e09-c705153a95e5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Lock "0c76babb-25fa-4e8b-9e09-c705153a95e5" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Acquiring lock "0c76babb-25fa-4e8b-9e09-c705153a95e5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Lock "0c76babb-25fa-4e8b-9e09-c705153a95e5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Lock "0c76babb-25fa-4e8b-9e09-c705153a95e5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:38 user nova-compute[71972]: INFO nova.compute.manager [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Terminating instance Apr 17 22:11:38 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Start destroying the instance on the hypervisor. 
{{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Acquiring lock "04e443ff-f9f1-4424-847d-ea7557307ec6" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "04e443ff-f9f1-4424-847d-ea7557307ec6" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Acquiring lock "04e443ff-f9f1-4424-847d-ea7557307ec6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "04e443ff-f9f1-4424-847d-ea7557307ec6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "04e443ff-f9f1-4424-847d-ea7557307ec6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:38 user nova-compute[71972]: INFO nova.compute.manager [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 
tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Terminating instance Apr 17 22:11:38 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG nova.compute.manager [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Start destroying the instance on the hypervisor. {{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:38 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:39 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json" returned: 0 in 0.163s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:39 user nova-compute[71972]: DEBUG nova.compute.manager [req-f87f7028-d1c4-4323-aa56-034c5ca1614a req-86b89c49-14ae-4a35-9a6d-6d6c6f0616d8 service nova] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Received event network-vif-unplugged-4629d4b1-f472-4302-9bf6-94f62369c1c1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 
22:11:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-f87f7028-d1c4-4323-aa56-034c5ca1614a req-86b89c49-14ae-4a35-9a6d-6d6c6f0616d8 service nova] Acquiring lock "0c76babb-25fa-4e8b-9e09-c705153a95e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-f87f7028-d1c4-4323-aa56-034c5ca1614a req-86b89c49-14ae-4a35-9a6d-6d6c6f0616d8 service nova] Lock "0c76babb-25fa-4e8b-9e09-c705153a95e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-f87f7028-d1c4-4323-aa56-034c5ca1614a req-86b89c49-14ae-4a35-9a6d-6d6c6f0616d8 service nova] Lock "0c76babb-25fa-4e8b-9e09-c705153a95e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:39 user nova-compute[71972]: DEBUG nova.compute.manager [req-f87f7028-d1c4-4323-aa56-034c5ca1614a req-86b89c49-14ae-4a35-9a6d-6d6c6f0616d8 service nova] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] No waiting events found dispatching network-vif-unplugged-4629d4b1-f472-4302-9bf6-94f62369c1c1 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:11:39 user nova-compute[71972]: DEBUG nova.compute.manager [req-f87f7028-d1c4-4323-aa56-034c5ca1614a req-86b89c49-14ae-4a35-9a6d-6d6c6f0616d8 service nova] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Received event network-vif-unplugged-4629d4b1-f472-4302-9bf6-94f62369c1c1 for instance with task_state deleting. 
{{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:11:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7c93039-48e6-40b8-b921-d6eb1ebe78ef/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:39 user nova-compute[71972]: DEBUG nova.compute.manager [req-02672e0e-89f9-4e8c-b0e6-334c982cc5ce req-45bc6226-9736-41b1-8315-7c3002246d36 service nova] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Received event network-vif-unplugged-1f194f69-5d98-4774-958a-1b5b81d978d3 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:11:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-02672e0e-89f9-4e8c-b0e6-334c982cc5ce req-45bc6226-9736-41b1-8315-7c3002246d36 service nova] Acquiring lock "04e443ff-f9f1-4424-847d-ea7557307ec6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-02672e0e-89f9-4e8c-b0e6-334c982cc5ce req-45bc6226-9736-41b1-8315-7c3002246d36 service nova] Lock "04e443ff-f9f1-4424-847d-ea7557307ec6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-02672e0e-89f9-4e8c-b0e6-334c982cc5ce req-45bc6226-9736-41b1-8315-7c3002246d36 service nova] Lock "04e443ff-f9f1-4424-847d-ea7557307ec6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:39 user nova-compute[71972]: DEBUG nova.compute.manager [req-02672e0e-89f9-4e8c-b0e6-334c982cc5ce req-45bc6226-9736-41b1-8315-7c3002246d36 service nova] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] No waiting events found dispatching network-vif-unplugged-1f194f69-5d98-4774-958a-1b5b81d978d3 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:11:39 user nova-compute[71972]: DEBUG nova.compute.manager [req-02672e0e-89f9-4e8c-b0e6-334c982cc5ce req-45bc6226-9736-41b1-8315-7c3002246d36 service nova] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Received event network-vif-unplugged-1f194f69-5d98-4774-958a-1b5b81d978d3 for instance with task_state deleting. 
{{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:11:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7c93039-48e6-40b8-b921-d6eb1ebe78ef/disk --force-share --output=json" returned: 0 in 0.163s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7c93039-48e6-40b8-b921-d6eb1ebe78ef/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d7c93039-48e6-40b8-b921-d6eb1ebe78ef/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:40 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Instance destroyed successfully. Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.objects.instance [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Lazy-loading 'resources' on Instance uuid 0c76babb-25fa-4e8b-9e09-c705153a95e5 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:11:40 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:11:40 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=8049MB free_disk=26.520885467529297GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:40 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Instance destroyed successfully. Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.objects.instance [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lazy-loading 'resources' on Instance uuid 04e443ff-f9f1-4424-847d-ea7557307ec6 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:09:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesActionsTest-instance-1085383727',display_name='tempest-VolumesActionsTest-instance-1085383727',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesactionstest-instance-1085383727',id=10,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-17T22:09:55Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='54885ecab9394b39a2a0d287761eda71',ramdisk_id='',reservation_id='r-eucmfo5p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesActionsTest-217042055',owner_user_name='tempest-VolumesActionsTest-217042055-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:09:56Z,user_data=None,user_id='74aabdff0142403bbb29d96ad103d2f8',uuid=0c76babb-25fa-4e8b-9e09-c705153a95e5,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4629d4b1-f472-4302-9bf6-94f62369c1c1", "address": "fa:16:3e:a8:48:3e", "network": {"id": "d8b9fec1-bd04-4ad0-a353-9e6f92eee9dd", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1408052112-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "54885ecab9394b39a2a0d287761eda71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4629d4b1-f4", "ovs_interfaceid": "4629d4b1-f472-4302-9bf6-94f62369c1c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Converting VIF {"id": "4629d4b1-f472-4302-9bf6-94f62369c1c1", "address": "fa:16:3e:a8:48:3e", "network": {"id": 
"d8b9fec1-bd04-4ad0-a353-9e6f92eee9dd", "bridge": "br-int", "label": "tempest-VolumesActionsTest-1408052112-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {}}], "meta": {"injected": false, "tenant_id": "54885ecab9394b39a2a0d287761eda71", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4629d4b1-f4", "ovs_interfaceid": "4629d4b1-f472-4302-9bf6-94f62369c1c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:48:3e,bridge_name='br-int',has_traffic_filtering=True,id=4629d4b1-f472-4302-9bf6-94f62369c1c1,network=Network(d8b9fec1-bd04-4ad0-a353-9e6f92eee9dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4629d4b1-f4') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG os_vif [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:48:3e,bridge_name='br-int',has_traffic_filtering=True,id=4629d4b1-f472-4302-9bf6-94f62369c1c1,network=Network(d8b9fec1-bd04-4ad0-a353-9e6f92eee9dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4629d4b1-f4') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4629d4b1-f4, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:09:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1918210233',display_name='tempest-TestMinimumBasicScenario-server-1918210233',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1918210233',id=9,image_ref='56106bc9-e4d5-47fb-b7a0-f8aadd59975a',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBByiPN5htPjbADhNBJJivdJ60suVrHNjFB0wkgRGT7xEfBp6fHU53P+o170ZE919d8wE4Y4BNeLjkXtGWbkhkDeRYSJIkjhlyNjlASEDFcsNq/wi6hmW0/zfJjZX/wd3og==',key_name='tempest-TestMinimumBasicScenario-706092591',keypairs=,launch_index=0,launched_at=2023-04-17T22:09:53Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='a82db257b3494faca3f3759644a51b30',ramdisk_id='',reservation_id='r-nri78qe3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='56106bc9-e4d5-47fb-b7a0-f8aadd59975a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-475067891',owner_user_name='tempest-TestMinimumBasicScenario-475067891-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:09:53Z,user_data=None,user_id='e11872391f1a487a8a8ba5a6d13589f1',uuid=04e443ff-f9f1-4424-847d-ea7557307ec6,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1f194f69-5d98-4774-958a-1b5b81d978d3", "address": "fa:16:3e:21:0f:6f", "network": {"id": "a91184a7-8ecf-419a-a191-2853dd054c4b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1266987381-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a82db257b3494faca3f3759644a51b30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f194f69-5d", "ovs_interfaceid": "1f194f69-5d98-4774-958a-1b5b81d978d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 
tempest-TestMinimumBasicScenario-475067891-project-member] Converting VIF {"id": "1f194f69-5d98-4774-958a-1b5b81d978d3", "address": "fa:16:3e:21:0f:6f", "network": {"id": "a91184a7-8ecf-419a-a191-2853dd054c4b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1266987381-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a82db257b3494faca3f3759644a51b30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f194f69-5d", "ovs_interfaceid": "1f194f69-5d98-4774-958a-1b5b81d978d3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:0f:6f,bridge_name='br-int',has_traffic_filtering=True,id=1f194f69-5d98-4774-958a-1b5b81d978d3,network=Network(a91184a7-8ecf-419a-a191-2853dd054c4b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f194f69-5d') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG os_vif [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:0f:6f,bridge_name='br-int',has_traffic_filtering=True,id=1f194f69-5d98-4774-958a-1b5b81d978d3,network=Network(a91184a7-8ecf-419a-a191-2853dd054c4b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f194f69-5d') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f194f69-5d, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:40 user nova-compute[71972]: INFO os_vif [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:48:3e,bridge_name='br-int',has_traffic_filtering=True,id=4629d4b1-f472-4302-9bf6-94f62369c1c1,network=Network(d8b9fec1-bd04-4ad0-a353-9e6f92eee9dd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4629d4b1-f4') Apr 17 22:11:40 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Deleting instance files /opt/stack/data/nova/instances/0c76babb-25fa-4e8b-9e09-c705153a95e5_del Apr 17 22:11:40 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Deletion of /opt/stack/data/nova/instances/0c76babb-25fa-4e8b-9e09-c705153a95e5_del complete Apr 17 22:11:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:11:40 user nova-compute[71972]: INFO os_vif [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:0f:6f,bridge_name='br-int',has_traffic_filtering=True,id=1f194f69-5d98-4774-958a-1b5b81d978d3,network=Network(a91184a7-8ecf-419a-a191-2853dd054c4b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f194f69-5d') Apr 17 22:11:40 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Deleting instance files /opt/stack/data/nova/instances/04e443ff-f9f1-4424-847d-ea7557307ec6_del Apr 17 22:11:40 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Deletion of /opt/stack/data/nova/instances/04e443ff-f9f1-4424-847d-ea7557307ec6_del complete Apr 17 22:11:40 user nova-compute[71972]: INFO nova.compute.manager [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Took 1.73 seconds to destroy the instance on the hypervisor. Apr 17 22:11:40 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:11:40 user nova-compute[71972]: INFO nova.compute.manager [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Took 1.61 seconds to destroy the instance on the hypervisor. Apr 17 22:11:40 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 7bb0580b-727f-4168-9d56-56dcb4fa404e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 99cb7131-abb8-41d6-bddd-c3bc943b7678 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 9634492c-168d-4b49-941a-b89703571b73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 97067629-e099-49fd-bb79-223dd4401405 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 04e443ff-f9f1-4424-847d-ea7557307ec6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 0c76babb-25fa-4e8b-9e09-c705153a95e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance d9386728-7c3f-42ff-8f1c-51748ccefff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance d7c93039-48e6-40b8-b921-d6eb1ebe78ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 8 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=1536MB phys_disk=40GB used_disk=8GB total_vcpus=12 used_vcpus=8 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:11:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.502s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG nova.compute.manager [req-4bb8873c-0248-4eca-81e4-54f238cd1842 req-67bd5479-0c81-4d81-b859-294b17f35575 service nova] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Received event network-vif-plugged-4629d4b1-f472-4302-9bf6-94f62369c1c1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-4bb8873c-0248-4eca-81e4-54f238cd1842 req-67bd5479-0c81-4d81-b859-294b17f35575 service nova] Acquiring lock "0c76babb-25fa-4e8b-9e09-c705153a95e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-4bb8873c-0248-4eca-81e4-54f238cd1842 req-67bd5479-0c81-4d81-b859-294b17f35575 service nova] Lock "0c76babb-25fa-4e8b-9e09-c705153a95e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-4bb8873c-0248-4eca-81e4-54f238cd1842 req-67bd5479-0c81-4d81-b859-294b17f35575 service nova] Lock "0c76babb-25fa-4e8b-9e09-c705153a95e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG nova.compute.manager [req-4bb8873c-0248-4eca-81e4-54f238cd1842 req-67bd5479-0c81-4d81-b859-294b17f35575 service nova] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] No waiting events found dispatching 
network-vif-plugged-4629d4b1-f472-4302-9bf6-94f62369c1c1 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:11:41 user nova-compute[71972]: WARNING nova.compute.manager [req-4bb8873c-0248-4eca-81e4-54f238cd1842 req-67bd5479-0c81-4d81-b859-294b17f35575 service nova] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Received unexpected event network-vif-plugged-4629d4b1-f472-4302-9bf6-94f62369c1c1 for instance with vm_state active and task_state deleting. Apr 17 22:11:41 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG nova.compute.manager [req-326babc0-2492-45e1-bcab-b1d60d1f488f req-04f42e9d-d9df-46ea-ba31-eb4d3e4ffb38 service nova] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Received event network-vif-deleted-1f194f69-5d98-4774-958a-1b5b81d978d3 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:11:41 user nova-compute[71972]: INFO nova.compute.manager [req-326babc0-2492-45e1-bcab-b1d60d1f488f req-04f42e9d-d9df-46ea-ba31-eb4d3e4ffb38 service nova] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Neutron deleted interface 1f194f69-5d98-4774-958a-1b5b81d978d3; detaching it from the instance and deleting it from the info cache Apr 17 22:11:41 user nova-compute[71972]: DEBUG nova.network.neutron [req-326babc0-2492-45e1-bcab-b1d60d1f488f req-04f42e9d-d9df-46ea-ba31-eb4d3e4ffb38 service nova] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:11:41 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Took 0.90 seconds to deallocate network for instance. Apr 17 22:11:41 user nova-compute[71972]: DEBUG nova.compute.manager [req-326babc0-2492-45e1-bcab-b1d60d1f488f req-04f42e9d-d9df-46ea-ba31-eb4d3e4ffb38 service nova] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Detach interface failed, port_id=1f194f69-5d98-4774-958a-1b5b81d978d3, reason: Instance 04e443ff-f9f1-4424-847d-ea7557307ec6 could not be found. 
{{(pid=71972) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG nova.compute.manager [req-3d9e3d6a-e88a-498f-b2b3-e207ce69a436 req-cec95bda-f7b8-45eb-911a-44b7a936c404 service nova] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Received event network-vif-plugged-1f194f69-5d98-4774-958a-1b5b81d978d3 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-3d9e3d6a-e88a-498f-b2b3-e207ce69a436 req-cec95bda-f7b8-45eb-911a-44b7a936c404 service nova] Acquiring lock "04e443ff-f9f1-4424-847d-ea7557307ec6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-3d9e3d6a-e88a-498f-b2b3-e207ce69a436 req-cec95bda-f7b8-45eb-911a-44b7a936c404 service nova] Lock "04e443ff-f9f1-4424-847d-ea7557307ec6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-3d9e3d6a-e88a-498f-b2b3-e207ce69a436 req-cec95bda-f7b8-45eb-911a-44b7a936c404 service nova] Lock "04e443ff-f9f1-4424-847d-ea7557307ec6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG nova.compute.manager [req-3d9e3d6a-e88a-498f-b2b3-e207ce69a436 req-cec95bda-f7b8-45eb-911a-44b7a936c404 service nova] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] No waiting events found dispatching network-vif-plugged-1f194f69-5d98-4774-958a-1b5b81d978d3 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:11:41 user nova-compute[71972]: WARNING nova.compute.manager [req-3d9e3d6a-e88a-498f-b2b3-e207ce69a436 req-cec95bda-f7b8-45eb-911a-44b7a936c404 service nova] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Received unexpected event network-vif-plugged-1f194f69-5d98-4774-958a-1b5b81d978d3 for instance with vm_state active and task_state deleting. 
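The lockutils entries just above show how nova-compute serializes external Neutron events per instance: it takes a "<uuid>-events" lock around pop_instance_event, finds no registered waiter, and therefore logs the late network-vif-plugged event as unexpected, since the instance is already in vm_state active / task_state deleting. A minimal sketch of that locking pattern, using the oslo.concurrency API that the log's file paths point at (the event table and helper below are hypothetical, for illustration only, not Nova's actual code):

# Sketch only, not Nova's implementation. It mirrors the per-instance
# "<uuid>-events" lock acquire/release pattern visible in the log, using
# the same oslo.concurrency library named in the lockutils.py paths above.
from oslo_concurrency import lockutils

# Hypothetical stand-in for Nova's table of waiters keyed by instance uuid.
_waiting_events = {}

def pop_instance_event(instance_uuid, event_name):
    """Pop a waiter for an external event, serialized per instance."""
    with lockutils.lock('%s-events' % instance_uuid):
        waiters = _waiting_events.get(instance_uuid, {})
        return waiters.pop(event_name, None)

# When no waiter is registered (the "No waiting events found dispatching"
# case above), the caller gets None and just logs the event; on an instance
# that is already being deleted this surfaces as the WARNING about an
# unexpected network-vif-plugged event.
if pop_instance_event('04e443ff-f9f1-4424-847d-ea7557307ec6',
                      'network-vif-plugged-1f194f69-5d98-4774-958a-1b5b81d978d3') is None:
    print('no waiter registered; event dropped')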
Apr 17 22:11:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:11:41 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Took 1.12 seconds to deallocate network for instance. Apr 17 22:11:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.316s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Lock "compute_resources" acquired 
by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.099s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:41 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Deleted allocations for instance 04e443ff-f9f1-4424-847d-ea7557307ec6 Apr 17 22:11:41 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-61950bd7-8437-47a9-bb83-0b2973735d96 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "04e443ff-f9f1-4424-847d-ea7557307ec6" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 3.032s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "refresh_cache-99cb7131-abb8-41d6-bddd-c3bc943b7678" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquired lock "refresh_cache-99cb7131-abb8-41d6-bddd-c3bc943b7678" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Forcefully refreshing network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 
'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:11:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.278s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:42 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Deleted allocations for instance 0c76babb-25fa-4e8b-9e09-c705153a95e5 Apr 17 22:11:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b3e4db78-f2d4-404f-9a7f-057ee77182de tempest-VolumesActionsTest-217042055 tempest-VolumesActionsTest-217042055-project-member] Lock "0c76babb-25fa-4e8b-9e09-c705153a95e5" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 3.423s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "128ffde0-5149-48d2-a56e-c41418fbc753" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "128ffde0-5149-48d2-a56e-c41418fbc753" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:42 user nova-compute[71972]: DEBUG nova.compute.manager [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Starting instance... 
{{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:11:42 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Updating instance_info_cache with network_info: [{"id": "aede8066-45b3-4414-98a0-50dda5a4ee66", "address": "fa:16:3e:be:3c:02", "network": {"id": "966f14e1-bfad-4b86-921c-0f8f5ad29a5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1196755439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.137", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b99089f4e3074ee7a5c1ada03ceb8984", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaede8066-45", "ovs_interfaceid": "aede8066-45b3-4414-98a0-50dda5a4ee66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:11:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Releasing lock "refresh_cache-99cb7131-abb8-41d6-bddd-c3bc943b7678" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:11:42 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Updated the network info_cache for instance {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 22:11:42 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:11:42 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:11:42 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:11:42 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:11:42 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:11:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:42 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:11:42 user nova-compute[71972]: INFO nova.compute.claims [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Claim successful on node user Apr 17 22:11:42 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:11:42 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:11:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.354s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:42 user nova-compute[71972]: DEBUG nova.compute.manager [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Start building networks asynchronously for instance. 
{{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:11:42 user nova-compute[71972]: DEBUG nova.compute.manager [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Allocating IP information in the background. {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:11:42 user nova-compute[71972]: DEBUG nova.network.neutron [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:11:42 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 22:11:42 user nova-compute[71972]: DEBUG nova.compute.manager [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Start building block device mappings for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG nova.policy [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '52b3e35c03b54ae4b5dabfb1325886a9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e52724ed9bc54905bd5eddd8504e4c77', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG nova.compute.manager [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Start spawning the instance on the hypervisor. 
{{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:11:43 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Creating image(s) Apr 17 22:11:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "/opt/stack/data/nova/instances/128ffde0-5149-48d2-a56e-c41418fbc753/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "/opt/stack/data/nova/instances/128ffde0-5149-48d2-a56e-c41418fbc753/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "/opt/stack/data/nova/instances/128ffde0-5149-48d2-a56e-c41418fbc753/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG nova.compute.manager [req-8ff64be4-f071-459c-afea-23e1f7755a90 req-95925e16-98c1-4f91-a073-8608828864bf service nova] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Received event network-vif-deleted-4629d4b1-f472-4302-9bf6-94f62369c1c1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 
--cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.149s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.136s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/128ffde0-5149-48d2-a56e-c41418fbc753/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/128ffde0-5149-48d2-a56e-c41418fbc753/disk 1073741824" returned: 0 in 0.048s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 
22:11:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.191s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.160s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Checking if we can resize image /opt/stack/data/nova/instances/128ffde0-5149-48d2-a56e-c41418fbc753/disk. size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/128ffde0-5149-48d2-a56e-c41418fbc753/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/128ffde0-5149-48d2-a56e-c41418fbc753/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Cannot resize image /opt/stack/data/nova/instances/128ffde0-5149-48d2-a56e-c41418fbc753/disk to a smaller size. 
{{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG nova.objects.instance [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lazy-loading 'migration_context' on Instance uuid 128ffde0-5149-48d2-a56e-c41418fbc753 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Ensure instance console log exists: /opt/stack/data/nova/instances/128ffde0-5149-48d2-a56e-c41418fbc753/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:43 user nova-compute[71972]: DEBUG nova.network.neutron [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Successfully created port: 7057b0c3-d9e3-4814-9d2d-2b70e922533b {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:11:44 user nova-compute[71972]: DEBUG nova.network.neutron [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Successfully updated port: 7057b0c3-d9e3-4814-9d2d-2b70e922533b {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:11:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 
tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "refresh_cache-128ffde0-5149-48d2-a56e-c41418fbc753" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:11:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquired lock "refresh_cache-128ffde0-5149-48d2-a56e-c41418fbc753" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:11:44 user nova-compute[71972]: DEBUG nova.network.neutron [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:11:44 user nova-compute[71972]: DEBUG nova.network.neutron [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Instance cache missing network info. {{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.compute.manager [req-db18c8d1-afb6-490e-bef2-f0a344fb50ef req-367a93d8-d112-4d48-a19c-431acbc55719 service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Received event network-changed-54d6ec48-0412-4678-9745-e657a446347d {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.compute.manager [req-db18c8d1-afb6-490e-bef2-f0a344fb50ef req-367a93d8-d112-4d48-a19c-431acbc55719 service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Refreshing instance network info cache due to event network-changed-54d6ec48-0412-4678-9745-e657a446347d. 
{{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-db18c8d1-afb6-490e-bef2-f0a344fb50ef req-367a93d8-d112-4d48-a19c-431acbc55719 service nova] Acquiring lock "refresh_cache-d9386728-7c3f-42ff-8f1c-51748ccefff3" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-db18c8d1-afb6-490e-bef2-f0a344fb50ef req-367a93d8-d112-4d48-a19c-431acbc55719 service nova] Acquired lock "refresh_cache-d9386728-7c3f-42ff-8f1c-51748ccefff3" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.network.neutron [req-db18c8d1-afb6-490e-bef2-f0a344fb50ef req-367a93d8-d112-4d48-a19c-431acbc55719 service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Refreshing network info cache for port 54d6ec48-0412-4678-9745-e657a446347d {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.network.neutron [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Updating instance_info_cache with network_info: [{"id": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "address": "fa:16:3e:7c:ba:7a", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7057b0c3-d9", "ovs_interfaceid": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.compute.manager [req-31547e98-0285-47a1-aa68-bcbcf64688d4 req-3ebdeb29-79b5-41d4-856b-fcbc8e750ec2 service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Received event network-changed-7057b0c3-d9e3-4814-9d2d-2b70e922533b {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.compute.manager [req-31547e98-0285-47a1-aa68-bcbcf64688d4 req-3ebdeb29-79b5-41d4-856b-fcbc8e750ec2 service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Refreshing instance network info cache due to event network-changed-7057b0c3-d9e3-4814-9d2d-2b70e922533b. 
{{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-31547e98-0285-47a1-aa68-bcbcf64688d4 req-3ebdeb29-79b5-41d4-856b-fcbc8e750ec2 service nova] Acquiring lock "refresh_cache-128ffde0-5149-48d2-a56e-c41418fbc753" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Releasing lock "refresh_cache-128ffde0-5149-48d2-a56e-c41418fbc753" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.compute.manager [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Instance network_info: |[{"id": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "address": "fa:16:3e:7c:ba:7a", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7057b0c3-d9", "ovs_interfaceid": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-31547e98-0285-47a1-aa68-bcbcf64688d4 req-3ebdeb29-79b5-41d4-856b-fcbc8e750ec2 service nova] Acquired lock "refresh_cache-128ffde0-5149-48d2-a56e-c41418fbc753" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.network.neutron [req-31547e98-0285-47a1-aa68-bcbcf64688d4 req-3ebdeb29-79b5-41d4-856b-fcbc8e750ec2 service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Refreshing network info cache for port 7057b0c3-d9e3-4814-9d2d-2b70e922533b {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Start _get_guest_xml network_info=[{"id": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "address": "fa:16:3e:7c:ba:7a", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7057b0c3-d9", "ovs_interfaceid": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:11:45 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:11:45 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71972) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:11:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-494563557',display_name='tempest-AttachVolumeNegativeTest-server-494563557',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-494563557',id=13,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCCM4zdhbyTNqZazXlr1WLqme2C0HFJDjVxfYGQZTlRtIWULGYITIT3EQ1q/6k7xhD3mm3oc0QfKdmb3gJ1SjZzP4d00vGPpTXvENVjyWcQGVsW5qXE+WwAnTojbreOxhA==',key_name='tempest-keypair-682999879',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e52724ed9bc54905bd5eddd8504e4c77',ramdisk_id='',reservation_id='r-dzjheaur',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1678965362',owner_user_name='tempest-AttachVolumeNegativeTest-1678965362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:11:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='52b3e35c03b54ae4b5dabfb1325886a9',uuid=128ffde0-5149-48d2-a56e-c41418fbc753,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "address": "fa:16:3e:7c:ba:7a", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7057b0c3-d9", "ovs_interfaceid": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Converting VIF {"id": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "address": "fa:16:3e:7c:ba:7a", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7057b0c3-d9", "ovs_interfaceid": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:ba:7a,bridge_name='br-int',has_traffic_filtering=True,id=7057b0c3-d9e3-4814-9d2d-2b70e922533b,network=Network(bd9d53b7-0e22-42c4-8b91-49575cadf44f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7057b0c3-d9') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.objects.instance [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lazy-loading 'pci_devices' on Instance uuid 128ffde0-5149-48d2-a56e-c41418fbc753 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] End _get_guest_xml xml= Apr 17 22:11:45 user nova-compute[71972]: 128ffde0-5149-48d2-a56e-c41418fbc753 Apr 17 22:11:45 user nova-compute[71972]: instance-0000000d Apr 17 22:11:45 user nova-compute[71972]: 131072 Apr 17 22:11:45 user nova-compute[71972]: 1 Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: tempest-AttachVolumeNegativeTest-server-494563557 Apr 17 22:11:45 user nova-compute[71972]: 2023-04-17 22:11:45 Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: 128 Apr 17 22:11:45 user nova-compute[71972]: 1 Apr 17 22:11:45 user nova-compute[71972]: 0 Apr 17 22:11:45 user nova-compute[71972]: 0 Apr 17 22:11:45 user nova-compute[71972]: 1 Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: tempest-AttachVolumeNegativeTest-1678965362-project-member Apr 17 22:11:45 user nova-compute[71972]: tempest-AttachVolumeNegativeTest-1678965362 Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: 
Apr 17 22:11:45 user nova-compute[71972]: OpenStack Foundation Apr 17 22:11:45 user nova-compute[71972]: OpenStack Nova Apr 17 22:11:45 user nova-compute[71972]: 0.0.0 Apr 17 22:11:45 user nova-compute[71972]: 128ffde0-5149-48d2-a56e-c41418fbc753 Apr 17 22:11:45 user nova-compute[71972]: 128ffde0-5149-48d2-a56e-c41418fbc753 Apr 17 22:11:45 user nova-compute[71972]: Virtual Machine Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: hvm Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Nehalem Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: /dev/urandom Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: Apr 17 22:11:45 user nova-compute[71972]: {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:11:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-494563557',display_name='tempest-AttachVolumeNegativeTest-server-494563557',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-494563557',id=13,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCCM4zdhbyTNqZazXlr1WLqme2C0HFJDjVxfYGQZTlRtIWULGYITIT3EQ1q/6k7xhD3mm3oc0QfKdmb3gJ1SjZzP4d00vGPpTXvENVjyWcQGVsW5qXE+WwAnTojbreOxhA==',key_name='tempest-keypair-682999879',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e52724ed9bc54905bd5eddd8504e4c77',ramdisk_id='',reservation_id='r-dzjheaur',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1678965362',owner_user_name='tempest-AttachVolumeNegativeTest-1678965362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:11:43Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='52b3e35c03b54ae4b5dabfb1325886a9',uuid=128ffde0-5149-48d2-a56e-c41418fbc753,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "address": "fa:16:3e:7c:ba:7a", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7057b0c3-d9", "ovs_interfaceid": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Converting VIF {"id": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "address": "fa:16:3e:7c:ba:7a", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7057b0c3-d9", "ovs_interfaceid": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:ba:7a,bridge_name='br-int',has_traffic_filtering=True,id=7057b0c3-d9e3-4814-9d2d-2b70e922533b,network=Network(bd9d53b7-0e22-42c4-8b91-49575cadf44f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7057b0c3-d9') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG os_vif [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:ba:7a,bridge_name='br-int',has_traffic_filtering=True,id=7057b0c3-d9e3-4814-9d2d-2b70e922533b,network=Network(bd9d53b7-0e22-42c4-8b91-49575cadf44f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7057b0c3-d9') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7057b0c3-d9, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7057b0c3-d9, col_values=(('external_ids', {'iface-id': '7057b0c3-d9e3-4814-9d2d-2b70e922533b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7c:ba:7a', 'vm-uuid': '128ffde0-5149-48d2-a56e-c41418fbc753'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:11:45 user nova-compute[71972]: 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:45 user nova-compute[71972]: INFO os_vif [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:ba:7a,bridge_name='br-int',has_traffic_filtering=True,id=7057b0c3-d9e3-4814-9d2d-2b70e922533b,network=Network(bd9d53b7-0e22-42c4-8b91-49575cadf44f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7057b0c3-d9') Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] No BDM found with device name vda, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] No VIF found with MAC fa:16:3e:7c:ba:7a, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.network.neutron [req-db18c8d1-afb6-490e-bef2-f0a344fb50ef req-367a93d8-d112-4d48-a19c-431acbc55719 service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Updated VIF entry in instance network info cache for port 54d6ec48-0412-4678-9745-e657a446347d. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG nova.network.neutron [req-db18c8d1-afb6-490e-bef2-f0a344fb50ef req-367a93d8-d112-4d48-a19c-431acbc55719 service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Updating instance_info_cache with network_info: [{"id": "54d6ec48-0412-4678-9745-e657a446347d", "address": "fa:16:3e:c6:26:04", "network": {"id": "83a63ee2-d2cb-414d-bab9-c556ee1c2c88", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-625420379-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "416fcd7cd2bc486884f751acab268fd8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d6ec48-04", "ovs_interfaceid": "54d6ec48-0412-4678-9745-e657a446347d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:11:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-db18c8d1-afb6-490e-bef2-f0a344fb50ef req-367a93d8-d112-4d48-a19c-431acbc55719 service nova] Releasing lock "refresh_cache-d9386728-7c3f-42ff-8f1c-51748ccefff3" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:11:46 user nova-compute[71972]: DEBUG nova.network.neutron [req-31547e98-0285-47a1-aa68-bcbcf64688d4 req-3ebdeb29-79b5-41d4-856b-fcbc8e750ec2 service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Updated VIF entry in instance network info cache for port 7057b0c3-d9e3-4814-9d2d-2b70e922533b. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:11:46 user nova-compute[71972]: DEBUG nova.network.neutron [req-31547e98-0285-47a1-aa68-bcbcf64688d4 req-3ebdeb29-79b5-41d4-856b-fcbc8e750ec2 service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Updating instance_info_cache with network_info: [{"id": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "address": "fa:16:3e:7c:ba:7a", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7057b0c3-d9", "ovs_interfaceid": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:11:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-31547e98-0285-47a1-aa68-bcbcf64688d4 req-3ebdeb29-79b5-41d4-856b-fcbc8e750ec2 service nova] Releasing lock "refresh_cache-128ffde0-5149-48d2-a56e-c41418fbc753" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:11:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Acquiring lock "e4e13341-82c2-4b86-8b5a-e12d435513ee" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 
tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "e4e13341-82c2-4b86-8b5a-e12d435513ee" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:47 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Starting instance... {{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:11:47 user nova-compute[71972]: DEBUG nova.compute.manager [req-b0b3ce9e-9745-42cc-90bd-9ca22009b39a req-89acf438-ebb2-4d80-938b-096c7b3075fb service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Received event network-vif-plugged-7057b0c3-d9e3-4814-9d2d-2b70e922533b {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:11:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-b0b3ce9e-9745-42cc-90bd-9ca22009b39a req-89acf438-ebb2-4d80-938b-096c7b3075fb service nova] Acquiring lock "128ffde0-5149-48d2-a56e-c41418fbc753-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-b0b3ce9e-9745-42cc-90bd-9ca22009b39a req-89acf438-ebb2-4d80-938b-096c7b3075fb service nova] Lock "128ffde0-5149-48d2-a56e-c41418fbc753-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-b0b3ce9e-9745-42cc-90bd-9ca22009b39a req-89acf438-ebb2-4d80-938b-096c7b3075fb service nova] Lock "128ffde0-5149-48d2-a56e-c41418fbc753-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:47 user nova-compute[71972]: DEBUG nova.compute.manager [req-b0b3ce9e-9745-42cc-90bd-9ca22009b39a req-89acf438-ebb2-4d80-938b-096c7b3075fb service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] No waiting events found dispatching network-vif-plugged-7057b0c3-d9e3-4814-9d2d-2b70e922533b {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:11:47 user nova-compute[71972]: WARNING nova.compute.manager [req-b0b3ce9e-9745-42cc-90bd-9ca22009b39a req-89acf438-ebb2-4d80-938b-096c7b3075fb service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Received unexpected event network-vif-plugged-7057b0c3-d9e3-4814-9d2d-2b70e922533b for instance with vm_state building and task_state spawning. 
Apr 17 22:11:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:47 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:11:47 user nova-compute[71972]: INFO nova.compute.claims [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Claim successful on node user Apr 17 22:11:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:47 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:11:47 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.371s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG 
nova.compute.manager [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Allocating IP information in the background. {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:11:48 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Start building block device mappings for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.policy [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '70af0dc4dbf24ae1add76f3c87f8b1b5', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '416fcd7cd2bc486884f751acab268fd8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Start spawning the instance on the hypervisor. 
{{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:11:48 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Creating image(s) Apr 17 22:11:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Acquiring lock "/opt/stack/data/nova/instances/e4e13341-82c2-4b86-8b5a-e12d435513ee/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "/opt/stack/data/nova/instances/e4e13341-82c2-4b86-8b5a-e12d435513ee/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "/opt/stack/data/nova/instances/e4e13341-82c2-4b86-8b5a-e12d435513ee/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.144s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None 
req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.142s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/e4e13341-82c2-4b86-8b5a-e12d435513ee/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/e4e13341-82c2-4b86-8b5a-e12d435513ee/disk 1073741824" returned: 0 in 0.051s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.202s 
{{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.143s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Checking if we can resize image /opt/stack/data/nova/instances/e4e13341-82c2-4b86-8b5a-e12d435513ee/disk. size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4e13341-82c2-4b86-8b5a-e12d435513ee/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4e13341-82c2-4b86-8b5a-e12d435513ee/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Cannot resize image /opt/stack/data/nova/instances/e4e13341-82c2-4b86-8b5a-e12d435513ee/disk to a smaller size. 
{{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.objects.instance [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lazy-loading 'migration_context' on Instance uuid e4e13341-82c2-4b86-8b5a-e12d435513ee {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:11:48 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] VM Resumed (Lifecycle Event) Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.compute.manager [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Instance event wait completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Ensure instance console log exists: /opt/stack/data/nova/instances/e4e13341-82c2-4b86-8b5a-e12d435513ee/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 
tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:11:48 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Instance spawned successfully. Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 
128ffde0-5149-48d2-a56e-c41418fbc753] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Found default for hw_vif_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:11:48 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:11:48 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] VM Started (Lifecycle Event) Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:11:48 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:11:49 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:11:49 user nova-compute[71972]: INFO nova.compute.manager [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Took 5.99 seconds to spawn the instance on the hypervisor. 
Apr 17 22:11:49 user nova-compute[71972]: DEBUG nova.compute.manager [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:11:49 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Successfully created port: ea831325-55ac-45ca-ab26-4b424e66ca77 {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:11:49 user nova-compute[71972]: INFO nova.compute.manager [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Took 6.67 seconds to build instance. Apr 17 22:11:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3a9c4025-7035-4a6a-941a-a81314b86046 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "128ffde0-5149-48d2-a56e-c41418fbc753" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.776s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:49 user nova-compute[71972]: DEBUG nova.compute.manager [req-64142654-d301-430e-838b-bae7e6046ce1 req-1cf30b8f-966b-4c16-847f-a4c719113cae service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Received event network-vif-plugged-7057b0c3-d9e3-4814-9d2d-2b70e922533b {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:11:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-64142654-d301-430e-838b-bae7e6046ce1 req-1cf30b8f-966b-4c16-847f-a4c719113cae service nova] Acquiring lock "128ffde0-5149-48d2-a56e-c41418fbc753-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-64142654-d301-430e-838b-bae7e6046ce1 req-1cf30b8f-966b-4c16-847f-a4c719113cae service nova] Lock "128ffde0-5149-48d2-a56e-c41418fbc753-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-64142654-d301-430e-838b-bae7e6046ce1 req-1cf30b8f-966b-4c16-847f-a4c719113cae service nova] Lock "128ffde0-5149-48d2-a56e-c41418fbc753-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:49 user nova-compute[71972]: DEBUG nova.compute.manager [req-64142654-d301-430e-838b-bae7e6046ce1 req-1cf30b8f-966b-4c16-847f-a4c719113cae service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] No waiting events found dispatching network-vif-plugged-7057b0c3-d9e3-4814-9d2d-2b70e922533b {{(pid=71972) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:11:49 user nova-compute[71972]: WARNING nova.compute.manager [req-64142654-d301-430e-838b-bae7e6046ce1 req-1cf30b8f-966b-4c16-847f-a4c719113cae service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Received unexpected event network-vif-plugged-7057b0c3-d9e3-4814-9d2d-2b70e922533b for instance with vm_state active and task_state None. Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Successfully updated port: ea831325-55ac-45ca-ab26-4b424e66ca77 {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.compute.manager [req-dbe9bef2-9995-4a38-8885-27a5102cb9b5 req-b3c349f0-99b5-4286-ae54-55a94775b679 service nova] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Received event network-changed-ea831325-55ac-45ca-ab26-4b424e66ca77 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.compute.manager [req-dbe9bef2-9995-4a38-8885-27a5102cb9b5 req-b3c349f0-99b5-4286-ae54-55a94775b679 service nova] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Refreshing instance network info cache due to event network-changed-ea831325-55ac-45ca-ab26-4b424e66ca77. {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-dbe9bef2-9995-4a38-8885-27a5102cb9b5 req-b3c349f0-99b5-4286-ae54-55a94775b679 service nova] Acquiring lock "refresh_cache-e4e13341-82c2-4b86-8b5a-e12d435513ee" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-dbe9bef2-9995-4a38-8885-27a5102cb9b5 req-b3c349f0-99b5-4286-ae54-55a94775b679 service nova] Acquired lock "refresh_cache-e4e13341-82c2-4b86-8b5a-e12d435513ee" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.network.neutron [req-dbe9bef2-9995-4a38-8885-27a5102cb9b5 req-b3c349f0-99b5-4286-ae54-55a94775b679 service nova] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Refreshing network info cache for port ea831325-55ac-45ca-ab26-4b424e66ca77 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Acquiring lock "refresh_cache-e4e13341-82c2-4b86-8b5a-e12d435513ee" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.network.neutron [req-dbe9bef2-9995-4a38-8885-27a5102cb9b5 req-b3c349f0-99b5-4286-ae54-55a94775b679 service nova] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Instance cache missing network info. 
{{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Acquiring lock "2b53a15a-2e55-4c9e-976b-addb176545fa" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "2b53a15a-2e55-4c9e-976b-addb176545fa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.compute.manager [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Starting instance... {{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:11:50 user nova-compute[71972]: INFO nova.compute.claims [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Claim successful on node user Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.network.neutron [req-dbe9bef2-9995-4a38-8885-27a5102cb9b5 req-b3c349f0-99b5-4286-ae54-55a94775b679 service nova] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-dbe9bef2-9995-4a38-8885-27a5102cb9b5 req-b3c349f0-99b5-4286-ae54-55a94775b679 service nova] Releasing lock "refresh_cache-e4e13341-82c2-4b86-8b5a-e12d435513ee" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Acquired lock "refresh_cache-e4e13341-82c2-4b86-8b5a-e12d435513ee" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Instance cache missing network info. 
{{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.397s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.compute.manager [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.compute.manager [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Allocating IP information in the background. {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.network.neutron [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:11:50 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.compute.manager [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Start building block device mappings for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Updating instance_info_cache with network_info: [{"id": "ea831325-55ac-45ca-ab26-4b424e66ca77", "address": "fa:16:3e:c5:43:1e", "network": {"id": "83a63ee2-d2cb-414d-bab9-c556ee1c2c88", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-625420379-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "416fcd7cd2bc486884f751acab268fd8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapea831325-55", "ovs_interfaceid": "ea831325-55ac-45ca-ab26-4b424e66ca77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Releasing lock "refresh_cache-e4e13341-82c2-4b86-8b5a-e12d435513ee" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Instance network_info: |[{"id": "ea831325-55ac-45ca-ab26-4b424e66ca77", "address": "fa:16:3e:c5:43:1e", "network": {"id": "83a63ee2-d2cb-414d-bab9-c556ee1c2c88", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-625420379-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "416fcd7cd2bc486884f751acab268fd8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapea831325-55", "ovs_interfaceid": "ea831325-55ac-45ca-ab26-4b424e66ca77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) 
_allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Start _get_guest_xml network_info=[{"id": "ea831325-55ac-45ca-ab26-4b424e66ca77", "address": "fa:16:3e:c5:43:1e", "network": {"id": "83a63ee2-d2cb-414d-bab9-c556ee1c2c88", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-625420379-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "416fcd7cd2bc486884f751acab268fd8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapea831325-55", "ovs_interfaceid": "ea831325-55ac-45ca-ab26-4b424e66ca77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:11:50 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:11:50 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
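
The network_info blobs logged above (and repeated for the later instances below) are ordinary JSON once the "Apr 17 ... nova-compute[...]" prefixes are stripped away. As a minimal, hedged illustration, not Nova code, the sketch below parses a pared-down copy of one logged entry and pulls out the fields most useful when tracing a port through these messages (MAC, tap device, fixed IP, MTU); every value is taken verbatim from the log, and the trimming of the structure is only for brevity here:

    # Minimal sketch: parse a pared-down copy of the network_info JSON seen in
    # the log above and print the identifying fields. The values come from the
    # logged entry for port ea831325-55ac-45ca-ab26-4b424e66ca77.
    import json

    network_info = json.loads('''[{"id": "ea831325-55ac-45ca-ab26-4b424e66ca77",
      "address": "fa:16:3e:c5:43:1e",
      "network": {"id": "83a63ee2-d2cb-414d-bab9-c556ee1c2c88", "bridge": "br-int",
        "subnets": [{"cidr": "10.0.0.0/28",
          "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4}]}],
        "meta": {"mtu": 1442, "tunneled": true}},
      "type": "ovs", "devname": "tapea831325-55", "vnic_type": "normal"}]''')

    vif = network_info[0]
    ip = vif["network"]["subnets"][0]["ips"][0]["address"]
    print(vif["address"], vif["devname"], ip, vif["network"]["meta"]["mtu"])
    # fa:16:3e:c5:43:1e tapea831325-55 10.0.0.3 1442
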
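The "Flavor limits 0:0:0", "Chose sockets=0, cores=0, threads=0" and "Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)]" lines that follow trace Nova enumerating candidate guest CPU topologies for the 1-vCPU m1.nano flavor. A rough, hedged re-implementation of that idea (not nova.virt.hardware itself; the helper name is made up) shows why only sockets=1, cores=1, threads=1 can come out of it when no flavor or image constraints are set:

    # Hedged sketch, not nova.virt.hardware: list every (sockets, cores, threads)
    # combination whose product equals the vCPU count, bounded by the logged
    # default limits of 65536 each. For a 1-vCPU guest only (1, 1, 1) fits.
    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        return [
            (s, c, t)
            for s in range(1, min(vcpus, max_sockets) + 1)
            for c in range(1, min(vcpus, max_cores) + 1)
            for t in range(1, min(vcpus, max_threads) + 1)
            if s * c * t == vcpus
        ]

    print(possible_topologies(1))  # [(1, 1, 1)]
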
Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71972) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:11:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:11:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1560519322',display_name='tempest-VolumesAdminNegativeTest-server-1560519322',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1560519322',id=14,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='416fcd7cd2bc486884f751acab268fd8',ramdisk_id='',reservation_id='r-eu7szm24',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-315022411',owner_user_name='tempest-VolumesAdminNegativeTest-315022411-project-member'},tags=Ta
gList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:11:48Z,user_data=None,user_id='70af0dc4dbf24ae1add76f3c87f8b1b5',uuid=e4e13341-82c2-4b86-8b5a-e12d435513ee,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ea831325-55ac-45ca-ab26-4b424e66ca77", "address": "fa:16:3e:c5:43:1e", "network": {"id": "83a63ee2-d2cb-414d-bab9-c556ee1c2c88", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-625420379-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "416fcd7cd2bc486884f751acab268fd8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapea831325-55", "ovs_interfaceid": "ea831325-55ac-45ca-ab26-4b424e66ca77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Converting VIF {"id": "ea831325-55ac-45ca-ab26-4b424e66ca77", "address": "fa:16:3e:c5:43:1e", "network": {"id": "83a63ee2-d2cb-414d-bab9-c556ee1c2c88", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-625420379-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "416fcd7cd2bc486884f751acab268fd8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapea831325-55", "ovs_interfaceid": "ea831325-55ac-45ca-ab26-4b424e66ca77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:43:1e,bridge_name='br-int',has_traffic_filtering=True,id=ea831325-55ac-45ca-ab26-4b424e66ca77,network=Network(83a63ee2-d2cb-414d-bab9-c556ee1c2c88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea831325-55') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG nova.objects.instance [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lazy-loading 'pci_devices' on Instance uuid 
e4e13341-82c2-4b86-8b5a-e12d435513ee {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] End _get_guest_xml xml= [guest domain XML elided: the XML element markup was stripped when this log was captured, leaving only bare values; the recoverable fragments are uuid e4e13341-82c2-4b86-8b5a-e12d435513ee, libvirt domain name instance-0000000e, memory 131072 KiB, 1 vCPU, nova name tempest-VolumesAdminNegativeTest-server-1560519322, creation time 2023-04-17 22:11:50, owner tempest-VolumesAdminNegativeTest-315022411 / tempest-VolumesAdminNegativeTest-315022411-project-member, sysinfo "OpenStack Foundation" / "OpenStack Nova" / "0.0.0" / "Virtual Machine", os type hvm, CPU model Nehalem, RNG backend /dev/urandom] {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}}
Apr 17 22:11:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:11:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1560519322',display_name='tempest-VolumesAdminNegativeTest-server-1560519322',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1560519322',id=14,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='416fcd7cd2bc486884f751acab268fd8',ramdisk_id='',reservation_id='r-eu7szm24',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-315022411',owner_user_name='tempest-VolumesAdminNegativeTest-315022411-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:11:48Z,user_data=None,user_id='70af0dc4dbf24ae1add76f3c87f8b1b5',uuid=e4e13341-82c2-4b86-8b5a-e12d435513ee,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ea831325-55ac-45ca-ab26-4b424e66ca77", "address": "fa:16:3e:c5:43:1e", "network": {"id": "83a63ee2-d2cb-414d-bab9-c556ee1c2c88", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-625420379-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type":
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "416fcd7cd2bc486884f751acab268fd8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapea831325-55", "ovs_interfaceid": "ea831325-55ac-45ca-ab26-4b424e66ca77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Converting VIF {"id": "ea831325-55ac-45ca-ab26-4b424e66ca77", "address": "fa:16:3e:c5:43:1e", "network": {"id": "83a63ee2-d2cb-414d-bab9-c556ee1c2c88", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-625420379-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "416fcd7cd2bc486884f751acab268fd8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapea831325-55", "ovs_interfaceid": "ea831325-55ac-45ca-ab26-4b424e66ca77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:43:1e,bridge_name='br-int',has_traffic_filtering=True,id=ea831325-55ac-45ca-ab26-4b424e66ca77,network=Network(83a63ee2-d2cb-414d-bab9-c556ee1c2c88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea831325-55') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG os_vif [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:43:1e,bridge_name='br-int',has_traffic_filtering=True,id=ea831325-55ac-45ca-ab26-4b424e66ca77,network=Network(83a63ee2-d2cb-414d-bab9-c556ee1c2c88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea831325-55') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapea831325-55, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapea831325-55, col_values=(('external_ids', {'iface-id': 'ea831325-55ac-45ca-ab26-4b424e66ca77', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c5:43:1e', 'vm-uuid': 'e4e13341-82c2-4b86-8b5a-e12d435513ee'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:51 user nova-compute[71972]: INFO os_vif [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:43:1e,bridge_name='br-int',has_traffic_filtering=True,id=ea831325-55ac-45ca-ab26-4b424e66ca77,network=Network(83a63ee2-d2cb-414d-bab9-c556ee1c2c88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea831325-55') Apr 17 22:11:51 user nova-compute[71972]: DEBUG nova.compute.manager [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Start spawning the instance on the hypervisor. 
{{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:11:51 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Creating image(s) Apr 17 22:11:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Acquiring lock "/opt/stack/data/nova/instances/2b53a15a-2e55-4c9e-976b-addb176545fa/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "/opt/stack/data/nova/instances/2b53a15a-2e55-4c9e-976b-addb176545fa/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "/opt/stack/data/nova/instances/2b53a15a-2e55-4c9e-976b-addb176545fa/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG nova.policy [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a5ec05fe7d7244368c7eec3739a96c19', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9283fe3c9a094f9bbddb08e48973da44', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 
'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] No BDM found with device name vda, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] No VIF found with MAC fa:16:3e:c5:43:1e, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.131s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.140s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:51 user 
nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/2b53a15a-2e55-4c9e-976b-addb176545fa/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/2b53a15a-2e55-4c9e-976b-addb176545fa/disk 1073741824" returned: 0 in 0.047s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.192s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.162s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Checking if we can resize image /opt/stack/data/nova/instances/2b53a15a-2e55-4c9e-976b-addb176545fa/disk. 
size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2b53a15a-2e55-4c9e-976b-addb176545fa/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2b53a15a-2e55-4c9e-976b-addb176545fa/disk --force-share --output=json" returned: 0 in 0.166s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Cannot resize image /opt/stack/data/nova/instances/2b53a15a-2e55-4c9e-976b-addb176545fa/disk to a smaller size. {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG nova.objects.instance [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lazy-loading 'migration_context' on Instance uuid 2b53a15a-2e55-4c9e-976b-addb176545fa {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Ensure instance console log exists: /opt/stack/data/nova/instances/2b53a15a-2e55-4c9e-976b-addb176545fa/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG 
oslo_concurrency.lockutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:52 user nova-compute[71972]: DEBUG nova.network.neutron [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Successfully created port: 2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7 {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:11:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:52 user nova-compute[71972]: DEBUG nova.network.neutron [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Successfully updated port: 2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7 {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:11:52 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Acquiring lock "refresh_cache-2b53a15a-2e55-4c9e-976b-addb176545fa" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:11:52 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Acquired lock "refresh_cache-2b53a15a-2e55-4c9e-976b-addb176545fa" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:11:52 user nova-compute[71972]: DEBUG nova.network.neutron [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 
tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:11:52 user nova-compute[71972]: DEBUG nova.compute.manager [req-b91fcdd1-5e07-4460-899a-30137ad15acb req-18062dce-9ac9-4b6b-b278-0b534ce2b97f service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Received event network-changed-2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:11:52 user nova-compute[71972]: DEBUG nova.compute.manager [req-b91fcdd1-5e07-4460-899a-30137ad15acb req-18062dce-9ac9-4b6b-b278-0b534ce2b97f service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Refreshing instance network info cache due to event network-changed-2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7. {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:11:52 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-b91fcdd1-5e07-4460-899a-30137ad15acb req-18062dce-9ac9-4b6b-b278-0b534ce2b97f service nova] Acquiring lock "refresh_cache-2b53a15a-2e55-4c9e-976b-addb176545fa" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:11:52 user nova-compute[71972]: DEBUG nova.network.neutron [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Instance cache missing network info. {{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.compute.manager [req-bdfe52a9-4d14-47eb-bc79-b6875db961ce req-15beb07f-3304-4712-8eec-0bcce04a68f4 service nova] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Received event network-vif-plugged-ea831325-55ac-45ca-ab26-4b424e66ca77 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-bdfe52a9-4d14-47eb-bc79-b6875db961ce req-15beb07f-3304-4712-8eec-0bcce04a68f4 service nova] Acquiring lock "e4e13341-82c2-4b86-8b5a-e12d435513ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-bdfe52a9-4d14-47eb-bc79-b6875db961ce req-15beb07f-3304-4712-8eec-0bcce04a68f4 service nova] Lock "e4e13341-82c2-4b86-8b5a-e12d435513ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-bdfe52a9-4d14-47eb-bc79-b6875db961ce req-15beb07f-3304-4712-8eec-0bcce04a68f4 service nova] Lock "e4e13341-82c2-4b86-8b5a-e12d435513ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.compute.manager [req-bdfe52a9-4d14-47eb-bc79-b6875db961ce req-15beb07f-3304-4712-8eec-0bcce04a68f4 service nova] [instance: 
e4e13341-82c2-4b86-8b5a-e12d435513ee] No waiting events found dispatching network-vif-plugged-ea831325-55ac-45ca-ab26-4b424e66ca77 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:11:53 user nova-compute[71972]: WARNING nova.compute.manager [req-bdfe52a9-4d14-47eb-bc79-b6875db961ce req-15beb07f-3304-4712-8eec-0bcce04a68f4 service nova] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Received unexpected event network-vif-plugged-ea831325-55ac-45ca-ab26-4b424e66ca77 for instance with vm_state building and task_state spawning. Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.network.neutron [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Updating instance_info_cache with network_info: [{"id": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "address": "fa:16:3e:0a:f9:e7", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2515c5ff-fa", "ovs_interfaceid": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Releasing lock "refresh_cache-2b53a15a-2e55-4c9e-976b-addb176545fa" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Instance network_info: |[{"id": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "address": "fa:16:3e:0a:f9:e7", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2515c5ff-fa", "ovs_interfaceid": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-b91fcdd1-5e07-4460-899a-30137ad15acb req-18062dce-9ac9-4b6b-b278-0b534ce2b97f service nova] Acquired lock "refresh_cache-2b53a15a-2e55-4c9e-976b-addb176545fa" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.network.neutron [req-b91fcdd1-5e07-4460-899a-30137ad15acb req-18062dce-9ac9-4b6b-b278-0b534ce2b97f service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Refreshing network info cache for port 2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Start _get_guest_xml network_info=[{"id": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "address": "fa:16:3e:0a:f9:e7", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2515c5ff-fa", "ovs_interfaceid": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:11:53 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 
tempest-AttachVolumeShelveTestJSON-1051644628-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:11:53 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Chose sockets=0, cores=0, 
threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:11:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-742045578',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-742045578',id=15,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPQR3wOyHdbetUVTI8L6ivA4S33oxQgPScnR6ThbG577rtQzESbVwvoSv8WcOg2mDZIkUxurn2f81Gs5LsSgI5VUppArPBavq2+Gv6ZDUuSwGAQdnLviswAseye+/hTX7A==',key_name='tempest-keypair-41502610',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9283fe3c9a094f9bbddb08e48973da44',ramdisk_id='',reservation_id='r-a0ylbpui',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1051644628',owner_user_name='tempest-AttachVolumeShelveTestJSON-1051644628-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:11:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a5ec05fe7d7244368c7eec3739a96c19',uuid=2b53a15a-2e55-4c9e-976b-addb176545fa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "address": "fa:16:3e:0a:f9:e7", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2515c5ff-fa", "ovs_interfaceid": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Converting VIF {"id": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "address": "fa:16:3e:0a:f9:e7", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], 
"meta": {"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2515c5ff-fa", "ovs_interfaceid": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:f9:e7,bridge_name='br-int',has_traffic_filtering=True,id=2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7,network=Network(2aae2552-3ef8-41d3-84e6-313da6fc203b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2515c5ff-fa') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.objects.instance [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lazy-loading 'pci_devices' on Instance uuid 2b53a15a-2e55-4c9e-976b-addb176545fa {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] End _get_guest_xml xml= Apr 17 22:11:53 user nova-compute[71972]: 2b53a15a-2e55-4c9e-976b-addb176545fa Apr 17 22:11:53 user nova-compute[71972]: instance-0000000f Apr 17 22:11:53 user nova-compute[71972]: 131072 Apr 17 22:11:53 user nova-compute[71972]: 1 Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: tempest-AttachVolumeShelveTestJSON-server-742045578 Apr 17 22:11:53 user nova-compute[71972]: 2023-04-17 22:11:53 Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: 128 Apr 17 22:11:53 user nova-compute[71972]: 1 Apr 17 22:11:53 user nova-compute[71972]: 0 Apr 17 22:11:53 user nova-compute[71972]: 0 Apr 17 22:11:53 user nova-compute[71972]: 1 Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: tempest-AttachVolumeShelveTestJSON-1051644628-project-member Apr 17 22:11:53 user nova-compute[71972]: tempest-AttachVolumeShelveTestJSON-1051644628 Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: OpenStack Foundation Apr 17 22:11:53 user nova-compute[71972]: OpenStack Nova Apr 17 22:11:53 user 
nova-compute[71972]: 0.0.0 Apr 17 22:11:53 user nova-compute[71972]: 2b53a15a-2e55-4c9e-976b-addb176545fa Apr 17 22:11:53 user nova-compute[71972]: 2b53a15a-2e55-4c9e-976b-addb176545fa Apr 17 22:11:53 user nova-compute[71972]: Virtual Machine Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: hvm Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Nehalem Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: /dev/urandom Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: Apr 17 22:11:53 user nova-compute[71972]: {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:11:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-742045578',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-742045578',id=15,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPQR3wOyHdbetUVTI8L6ivA4S33oxQgPScnR6ThbG577rtQzESbVwvoSv8WcOg2mDZIkUxurn2f81Gs5LsSgI5VUppArPBavq2+Gv6ZDUuSwGAQdnLviswAseye+/hTX7A==',key_name='tempest-keypair-41502610',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9283fe3c9a094f9bbddb08e48973da44',ramdisk_id='',reservation_id='r-a0ylbpui',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-1051644628',owner_user_name='tempest-AttachVolumeShelveTestJSON-1051644628-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:11:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a5ec05fe7d7244368c7eec3739a96c19',uuid=2b53a15a-2e55-4c9e-976b-addb176545fa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "address": "fa:16:3e:0a:f9:e7", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2515c5ff-fa", "ovs_interfaceid": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Converting VIF {"id": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "address": "fa:16:3e:0a:f9:e7", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2515c5ff-fa", "ovs_interfaceid": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0a:f9:e7,bridge_name='br-int',has_traffic_filtering=True,id=2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7,network=Network(2aae2552-3ef8-41d3-84e6-313da6fc203b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2515c5ff-fa') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG os_vif [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:f9:e7,bridge_name='br-int',has_traffic_filtering=True,id=2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7,network=Network(2aae2552-3ef8-41d3-84e6-313da6fc203b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2515c5ff-fa') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2515c5ff-fa, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2515c5ff-fa, col_values=(('external_ids', {'iface-id': '2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0a:f9:e7', 'vm-uuid': '2b53a15a-2e55-4c9e-976b-addb176545fa'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:11:53 user 
nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:53 user nova-compute[71972]: INFO os_vif [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0a:f9:e7,bridge_name='br-int',has_traffic_filtering=True,id=2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7,network=Network(2aae2552-3ef8-41d3-84e6-313da6fc203b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2515c5ff-fa') Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] No BDM found with device name vda, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] No VIF found with MAC fa:16:3e:0a:f9:e7, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.network.neutron [req-b91fcdd1-5e07-4460-899a-30137ad15acb req-18062dce-9ac9-4b6b-b278-0b534ce2b97f service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Updated VIF entry in instance network info cache for port 2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG nova.network.neutron [req-b91fcdd1-5e07-4460-899a-30137ad15acb req-18062dce-9ac9-4b6b-b278-0b534ce2b97f service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Updating instance_info_cache with network_info: [{"id": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "address": "fa:16:3e:0a:f9:e7", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2515c5ff-fa", "ovs_interfaceid": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:11:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-b91fcdd1-5e07-4460-899a-30137ad15acb req-18062dce-9ac9-4b6b-b278-0b534ce2b97f service nova] Releasing lock "refresh_cache-2b53a15a-2e55-4c9e-976b-addb176545fa" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:11:54 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:11:54 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] VM Resumed (Lifecycle Event) Apr 17 22:11:54 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Instance event wait completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:11:54 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:11:54 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:11:54 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Instance spawned successfully. 
Apr 17 22:11:54 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:11:54 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:11:54 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:54 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:11:54 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:11:54 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:11:54 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:11:54 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:11:54 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Found default for hw_vif_model of virtio {{(pid=71972) 
_register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:11:54 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:54 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:11:54 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:11:54 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] VM Started (Lifecycle Event) Apr 17 22:11:54 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:54 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:11:54 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:11:54 user nova-compute[71972]: INFO nova.compute.manager [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Took 6.73 seconds to spawn the instance on the hypervisor. Apr 17 22:11:54 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:11:54 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:11:55 user nova-compute[71972]: INFO nova.compute.manager [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Took 7.45 seconds to build instance. 
Apr 17 22:11:55 user nova-compute[71972]: DEBUG nova.compute.manager [req-9e689023-848f-4a1d-8aa8-a6899550a9f9 req-6b218b60-77f8-409a-944b-f43dc02256e4 service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Received event network-vif-plugged-2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:11:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-9e689023-848f-4a1d-8aa8-a6899550a9f9 req-6b218b60-77f8-409a-944b-f43dc02256e4 service nova] Acquiring lock "2b53a15a-2e55-4c9e-976b-addb176545fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-9e689023-848f-4a1d-8aa8-a6899550a9f9 req-6b218b60-77f8-409a-944b-f43dc02256e4 service nova] Lock "2b53a15a-2e55-4c9e-976b-addb176545fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-9e689023-848f-4a1d-8aa8-a6899550a9f9 req-6b218b60-77f8-409a-944b-f43dc02256e4 service nova] Lock "2b53a15a-2e55-4c9e-976b-addb176545fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:55 user nova-compute[71972]: DEBUG nova.compute.manager [req-9e689023-848f-4a1d-8aa8-a6899550a9f9 req-6b218b60-77f8-409a-944b-f43dc02256e4 service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] No waiting events found dispatching network-vif-plugged-2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:11:55 user nova-compute[71972]: WARNING nova.compute.manager [req-9e689023-848f-4a1d-8aa8-a6899550a9f9 req-6b218b60-77f8-409a-944b-f43dc02256e4 service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Received unexpected event network-vif-plugged-2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7 for instance with vm_state building and task_state spawning. 
Apr 17 22:11:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f74ba107-639d-4a66-95c2-f01b5dcd3546 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "e4e13341-82c2-4b86-8b5a-e12d435513ee" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.583s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:55 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:11:55 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] VM Stopped (Lifecycle Event) Apr 17 22:11:55 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:11:55 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] VM Stopped (Lifecycle Event) Apr 17 22:11:55 user nova-compute[71972]: DEBUG nova.compute.manager [req-8c4b8c9c-ff61-453d-892e-562b5b55b7aa req-c2840427-59bb-4bd7-8185-8634b3870d23 service nova] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Received event network-vif-plugged-ea831325-55ac-45ca-ab26-4b424e66ca77 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:11:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-8c4b8c9c-ff61-453d-892e-562b5b55b7aa req-c2840427-59bb-4bd7-8185-8634b3870d23 service nova] Acquiring lock "e4e13341-82c2-4b86-8b5a-e12d435513ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-8c4b8c9c-ff61-453d-892e-562b5b55b7aa req-c2840427-59bb-4bd7-8185-8634b3870d23 service nova] Lock "e4e13341-82c2-4b86-8b5a-e12d435513ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-8c4b8c9c-ff61-453d-892e-562b5b55b7aa req-c2840427-59bb-4bd7-8185-8634b3870d23 service nova] Lock "e4e13341-82c2-4b86-8b5a-e12d435513ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:55 user nova-compute[71972]: DEBUG nova.compute.manager [req-8c4b8c9c-ff61-453d-892e-562b5b55b7aa req-c2840427-59bb-4bd7-8185-8634b3870d23 service nova] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] No waiting events found dispatching network-vif-plugged-ea831325-55ac-45ca-ab26-4b424e66ca77 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:11:55 user nova-compute[71972]: WARNING nova.compute.manager [req-8c4b8c9c-ff61-453d-892e-562b5b55b7aa req-c2840427-59bb-4bd7-8185-8634b3870d23 service nova] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Received unexpected event network-vif-plugged-ea831325-55ac-45ca-ab26-4b424e66ca77 for instance with vm_state active and task_state None. 
Apr 17 22:11:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-11ab7c79-86d1-404a-ab19-8239e21ada55 None None] [instance: 04e443ff-f9f1-4424-847d-ea7557307ec6] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:11:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2875619c-a265-4cbd-8a6f-ca6270db96cd None None] [instance: 0c76babb-25fa-4e8b-9e09-c705153a95e5] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:11:55 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:55 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:55 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:57 user nova-compute[71972]: DEBUG nova.compute.manager [req-0efce87f-61d5-431a-ac5d-0d581e1bc553 req-b44abc8b-6a9a-4a35-95b2-15ad7c77b946 service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Received event network-vif-plugged-2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:11:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-0efce87f-61d5-431a-ac5d-0d581e1bc553 req-b44abc8b-6a9a-4a35-95b2-15ad7c77b946 service nova] Acquiring lock "2b53a15a-2e55-4c9e-976b-addb176545fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:11:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-0efce87f-61d5-431a-ac5d-0d581e1bc553 req-b44abc8b-6a9a-4a35-95b2-15ad7c77b946 service nova] Lock "2b53a15a-2e55-4c9e-976b-addb176545fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:11:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-0efce87f-61d5-431a-ac5d-0d581e1bc553 req-b44abc8b-6a9a-4a35-95b2-15ad7c77b946 service nova] Lock "2b53a15a-2e55-4c9e-976b-addb176545fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:57 user nova-compute[71972]: DEBUG nova.compute.manager [req-0efce87f-61d5-431a-ac5d-0d581e1bc553 req-b44abc8b-6a9a-4a35-95b2-15ad7c77b946 service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] No waiting events found dispatching network-vif-plugged-2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:11:57 user nova-compute[71972]: WARNING nova.compute.manager [req-0efce87f-61d5-431a-ac5d-0d581e1bc553 req-b44abc8b-6a9a-4a35-95b2-15ad7c77b946 service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Received unexpected event 
network-vif-plugged-2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7 for instance with vm_state building and task_state spawning. Apr 17 22:11:57 user nova-compute[71972]: DEBUG nova.compute.manager [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Instance event wait completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:11:57 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:11:57 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:11:57 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] VM Resumed (Lifecycle Event) Apr 17 22:11:57 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Instance spawned successfully. Apr 17 22:11:57 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:11:57 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:11:57 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:11:57 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:11:57 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 
22:11:57 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:11:57 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:11:57 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Found default for hw_vif_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:11:57 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:11:57 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:11:57 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:11:57 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] VM Started (Lifecycle Event) Apr 17 22:11:57 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:11:57 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:11:57 user nova-compute[71972]: INFO nova.compute.manager [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Took 6.33 seconds to spawn the instance on the hypervisor. 
Apr 17 22:11:57 user nova-compute[71972]: DEBUG nova.compute.manager [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:11:57 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:11:57 user nova-compute[71972]: INFO nova.compute.manager [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Took 7.13 seconds to build instance. Apr 17 22:11:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-6292db7e-dad9-44b5-bf16-433c7a3c5d61 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "2b53a15a-2e55-4c9e-976b-addb176545fa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.254s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:11:57 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:58 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:11:58 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:00 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:03 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:03 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:03 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:07 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:07 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:08 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:10 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:12 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:12 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:13 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:17 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:20 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:27 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:27 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:30 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:33 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Acquiring lock "cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:12:33 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:12:33 user nova-compute[71972]: DEBUG nova.compute.manager [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Starting instance... {{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:12:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:33 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:12:33 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:12:33 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:12:33 user nova-compute[71972]: INFO nova.compute.claims [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Claim successful on node user Apr 17 22:12:33 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:12:33 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:12:33 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.396s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:12:33 user nova-compute[71972]: DEBUG nova.compute.manager [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:12:33 user nova-compute[71972]: DEBUG nova.compute.manager [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Allocating IP information in the background. {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:12:33 user nova-compute[71972]: DEBUG nova.network.neutron [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:12:34 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 17 22:12:34 user nova-compute[71972]: DEBUG nova.compute.manager [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Start building block device mappings for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:12:34 user nova-compute[71972]: DEBUG nova.policy [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e11872391f1a487a8a8ba5a6d13589f1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a82db257b3494faca3f3759644a51b30', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:12:34 user nova-compute[71972]: DEBUG nova.compute.manager [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Start spawning the instance on the hypervisor. {{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:12:34 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:12:34 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Creating image(s) Apr 17 22:12:34 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Acquiring lock "/opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:12:34 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "/opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:12:34 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock 
"/opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:12:34 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Acquiring lock "04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:12:34 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:12:34 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e.part --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:34 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e.part --force-share --output=json" returned: 0 in 0.143s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:34 user nova-compute[71972]: DEBUG nova.virt.images [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] 7caab0a1-de3d-4117-b612-096de189bac9 was qcow2, converting to raw {{(pid=71972) fetch_to_raw /opt/stack/nova/nova/virt/images.py:165}} Apr 17 22:12:34 user nova-compute[71972]: DEBUG nova.privsep.utils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71972) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 17 22:12:34 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e.part /opt/stack/data/nova/instances/_base/04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e.converted {{(pid=71972) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:34 user nova-compute[71972]: DEBUG nova.network.neutron [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Successfully created port: bc18e018-6929-44c7-be4a-53fc82ef85e1 {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] CMD "qemu-img convert -t none -O raw -f qcow2 /opt/stack/data/nova/instances/_base/04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e.part /opt/stack/data/nova/instances/_base/04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e.converted" returned: 0 in 0.255s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e.converted --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e.converted --force-share --output=json" returned: 0 in 0.139s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.156s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 
-- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e --force-share --output=json" returned: 0 in 0.143s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Acquiring lock "04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e --force-share --output=json" returned: 0 in 0.134s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e,backing_fmt=raw /opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:12:35 user 
nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e,backing_fmt=raw /opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b/disk 1073741824" returned: 0 in 0.055s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.195s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG nova.network.neutron [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Successfully updated port: bc18e018-6929-44c7-be4a-53fc82ef85e1 {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Acquiring lock "refresh_cache-cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Acquired lock "refresh_cache-cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG nova.network.neutron [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG nova.compute.manager [req-e2bb87be-a9db-48d0-99c7-9e731be5e591 req-5f9d284c-03cb-40d9-8cd7-b7ec27d9067c service nova] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Received event network-changed-bc18e018-6929-44c7-be4a-53fc82ef85e1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 
17 22:12:35 user nova-compute[71972]: DEBUG nova.compute.manager [req-e2bb87be-a9db-48d0-99c7-9e731be5e591 req-5f9d284c-03cb-40d9-8cd7-b7ec27d9067c service nova] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Refreshing instance network info cache due to event network-changed-bc18e018-6929-44c7-be4a-53fc82ef85e1. {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-e2bb87be-a9db-48d0-99c7-9e731be5e591 req-5f9d284c-03cb-40d9-8cd7-b7ec27d9067c service nova] Acquiring lock "refresh_cache-cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Acquiring lock "d7c93039-48e6-40b8-b921-d6eb1ebe78ef" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "d7c93039-48e6-40b8-b921-d6eb1ebe78ef" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Acquiring lock "d7c93039-48e6-40b8-b921-d6eb1ebe78ef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "d7c93039-48e6-40b8-b921-d6eb1ebe78ef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "d7c93039-48e6-40b8-b921-d6eb1ebe78ef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:12:35 user nova-compute[71972]: INFO nova.compute.manager [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Terminating instance Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 
tempest-TestMinimumBasicScenario-475067891-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/04efc14f4cb3ec7da7e0c7cc196bf1fb8ecebf7e --force-share --output=json" returned: 0 in 0.156s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Checking if we can resize image /opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b/disk. size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG nova.compute.manager [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Start destroying the instance on the hypervisor. {{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG nova.network.neutron [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Instance cache missing network info. {{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Cannot resize image /opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b/disk to a smaller size. 
{{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG nova.objects.instance [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lazy-loading 'migration_context' on Instance uuid cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Ensure instance console log exists: /opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:12:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.compute.manager [req-5fddcd66-901c-49a4-a17e-e33db2414dbd req-c9ed94bf-5a6d-44c4-8ac9-282b6f8c50f2 service nova] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Received event network-vif-unplugged-b377f91d-95cf-42f6-8cb9-62aa8d68bcb1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-5fddcd66-901c-49a4-a17e-e33db2414dbd req-c9ed94bf-5a6d-44c4-8ac9-282b6f8c50f2 service nova] Acquiring lock "d7c93039-48e6-40b8-b921-d6eb1ebe78ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-5fddcd66-901c-49a4-a17e-e33db2414dbd req-c9ed94bf-5a6d-44c4-8ac9-282b6f8c50f2 service nova] Lock 
"d7c93039-48e6-40b8-b921-d6eb1ebe78ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-5fddcd66-901c-49a4-a17e-e33db2414dbd req-c9ed94bf-5a6d-44c4-8ac9-282b6f8c50f2 service nova] Lock "d7c93039-48e6-40b8-b921-d6eb1ebe78ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.compute.manager [req-5fddcd66-901c-49a4-a17e-e33db2414dbd req-c9ed94bf-5a6d-44c4-8ac9-282b6f8c50f2 service nova] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] No waiting events found dispatching network-vif-unplugged-b377f91d-95cf-42f6-8cb9-62aa8d68bcb1 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.compute.manager [req-5fddcd66-901c-49a4-a17e-e33db2414dbd req-c9ed94bf-5a6d-44c4-8ac9-282b6f8c50f2 service nova] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Received event network-vif-unplugged-b377f91d-95cf-42f6-8cb9-62aa8d68bcb1 for instance with task_state deleting. {{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.network.neutron [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Updating instance_info_cache with network_info: [{"id": "bc18e018-6929-44c7-be4a-53fc82ef85e1", "address": "fa:16:3e:6d:de:30", "network": {"id": "a91184a7-8ecf-419a-a191-2853dd054c4b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1266987381-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a82db257b3494faca3f3759644a51b30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc18e018-69", "ovs_interfaceid": "bc18e018-6929-44c7-be4a-53fc82ef85e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Releasing lock "refresh_cache-cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b" {{(pid=71972) 
lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.compute.manager [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Instance network_info: |[{"id": "bc18e018-6929-44c7-be4a-53fc82ef85e1", "address": "fa:16:3e:6d:de:30", "network": {"id": "a91184a7-8ecf-419a-a191-2853dd054c4b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1266987381-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a82db257b3494faca3f3759644a51b30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc18e018-69", "ovs_interfaceid": "bc18e018-6929-44c7-be4a-53fc82ef85e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-e2bb87be-a9db-48d0-99c7-9e731be5e591 req-5f9d284c-03cb-40d9-8cd7-b7ec27d9067c service nova] Acquired lock "refresh_cache-cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.network.neutron [req-e2bb87be-a9db-48d0-99c7-9e731be5e591 req-5f9d284c-03cb-40d9-8cd7-b7ec27d9067c service nova] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Refreshing network info cache for port bc18e018-6929-44c7-be4a-53fc82ef85e1 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Start _get_guest_xml network_info=[{"id": "bc18e018-6929-44c7-be4a-53fc82ef85e1", "address": "fa:16:3e:6d:de:30", "network": {"id": "a91184a7-8ecf-419a-a191-2853dd054c4b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1266987381-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a82db257b3494faca3f3759644a51b30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc18e018-69", "ovs_interfaceid": "bc18e018-6929-44c7-be4a-53fc82ef85e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': 
{'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:12:30Z,direct_url=,disk_format='qcow2',id=7caab0a1-de3d-4117-b612-096de189bac9,min_disk=0,min_ram=0,name='tempest-scenario-img--1586491885',owner='a82db257b3494faca3f3759644a51b30',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:12:32Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '7caab0a1-de3d-4117-b612-096de189bac9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:12:36 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:12:36 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:12:30Z,direct_url=,disk_format='qcow2',id=7caab0a1-de3d-4117-b612-096de189bac9,min_disk=0,min_ram=0,name='tempest-scenario-img--1586491885',owner='a82db257b3494faca3f3759644a51b30',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:12:32Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 
22:12:36 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:12:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1515339523',display_name='tempest-TestMinimumBasicScenario-server-1515339523',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1515339523',id=16,image_ref='7caab0a1-de3d-4117-b612-096de189bac9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCsXE6OrnGHKpZRfRRcidgOPWNw8nV1m4nQYTe67gmTvGnqZQ+yuBJwyAgcKnu1jiPhFMF5q36cDq/h1aU41fT0gQG5M5HcCdSjY9QHTwnJNWH/cKSA+xRQ7GbYrnZAxmQ==',key_name='tempest-TestMinimumBasicScenario-2023545159',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a82db257b3494faca3f3759644a51b30',ramdisk_id='',reservation_id='r-xr2x064l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7caab0a1-de3d-4117-b612-096de189bac9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-475067891',owner_user_name='tempest-TestMinimumBasicScenario-475067891-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:12:34Z,user_data=None,user_id='e11872391f1a487a8a8ba5a6d13589f1',uuid=cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bc18e018-6929-44c7-be4a-53fc82ef85e1", "address": "fa:16:3e:6d:de:30", "network": {"id": "a91184a7-8ecf-419a-a191-2853dd054c4b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1266987381-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a82db257b3494faca3f3759644a51b30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc18e018-69", "ovs_interfaceid": "bc18e018-6929-44c7-be4a-53fc82ef85e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Converting VIF {"id": 
"bc18e018-6929-44c7-be4a-53fc82ef85e1", "address": "fa:16:3e:6d:de:30", "network": {"id": "a91184a7-8ecf-419a-a191-2853dd054c4b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1266987381-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a82db257b3494faca3f3759644a51b30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc18e018-69", "ovs_interfaceid": "bc18e018-6929-44c7-be4a-53fc82ef85e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:de:30,bridge_name='br-int',has_traffic_filtering=True,id=bc18e018-6929-44c7-be4a-53fc82ef85e1,network=Network(a91184a7-8ecf-419a-a191-2853dd054c4b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc18e018-69') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.objects.instance [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lazy-loading 'pci_devices' on Instance uuid cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] End _get_guest_xml xml= Apr 17 22:12:36 user nova-compute[71972]: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b Apr 17 22:12:36 user nova-compute[71972]: instance-00000010 Apr 17 22:12:36 user nova-compute[71972]: 131072 Apr 17 22:12:36 user nova-compute[71972]: 1 Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: tempest-TestMinimumBasicScenario-server-1515339523 Apr 17 22:12:36 user nova-compute[71972]: 2023-04-17 22:12:36 Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: 128 Apr 17 22:12:36 user nova-compute[71972]: 1 Apr 17 22:12:36 user nova-compute[71972]: 0 Apr 17 22:12:36 user nova-compute[71972]: 0 Apr 17 22:12:36 user nova-compute[71972]: 1 Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: tempest-TestMinimumBasicScenario-475067891-project-member Apr 17 22:12:36 user nova-compute[71972]: tempest-TestMinimumBasicScenario-475067891 Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 
user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: OpenStack Foundation Apr 17 22:12:36 user nova-compute[71972]: OpenStack Nova Apr 17 22:12:36 user nova-compute[71972]: 0.0.0 Apr 17 22:12:36 user nova-compute[71972]: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b Apr 17 22:12:36 user nova-compute[71972]: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b Apr 17 22:12:36 user nova-compute[71972]: Virtual Machine Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: hvm Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Nehalem Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: /dev/urandom Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: Apr 17 22:12:36 user nova-compute[71972]: {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:12:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1515339523',display_name='tempest-TestMinimumBasicScenario-server-1515339523',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1515339523',id=16,image_ref='7caab0a1-de3d-4117-b612-096de189bac9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCsXE6OrnGHKpZRfRRcidgOPWNw8nV1m4nQYTe67gmTvGnqZQ+yuBJwyAgcKnu1jiPhFMF5q36cDq/h1aU41fT0gQG5M5HcCdSjY9QHTwnJNWH/cKSA+xRQ7GbYrnZAxmQ==',key_name='tempest-TestMinimumBasicScenario-2023545159',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a82db257b3494faca3f3759644a51b30',ramdisk_id='',reservation_id='r-xr2x064l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7caab0a1-de3d-4117-b612-096de189bac9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-475067891',owner_user_name='tempest-TestMinimumBasicScenario-475067891-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:12:34Z,user_data=None,user_id='e11872391f1a487a8a8ba5a6d13589f1',uuid=cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bc18e018-6929-44c7-be4a-53fc82ef85e1", "address": "fa:16:3e:6d:de:30", "network": {"id": "a91184a7-8ecf-419a-a191-2853dd054c4b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1266987381-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a82db257b3494faca3f3759644a51b30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc18e018-69", "ovs_interfaceid": "bc18e018-6929-44c7-be4a-53fc82ef85e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Converting VIF {"id": "bc18e018-6929-44c7-be4a-53fc82ef85e1", 
"address": "fa:16:3e:6d:de:30", "network": {"id": "a91184a7-8ecf-419a-a191-2853dd054c4b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1266987381-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a82db257b3494faca3f3759644a51b30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc18e018-69", "ovs_interfaceid": "bc18e018-6929-44c7-be4a-53fc82ef85e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:de:30,bridge_name='br-int',has_traffic_filtering=True,id=bc18e018-6929-44c7-be4a-53fc82ef85e1,network=Network(a91184a7-8ecf-419a-a191-2853dd054c4b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc18e018-69') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG os_vif [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:de:30,bridge_name='br-int',has_traffic_filtering=True,id=bc18e018-6929-44c7-be4a-53fc82ef85e1,network=Network(a91184a7-8ecf-419a-a191-2853dd054c4b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc18e018-69') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbc18e018-69, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbc18e018-69, col_values=(('external_ids', {'iface-id': 'bc18e018-6929-44c7-be4a-53fc82ef85e1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6d:de:30', 'vm-uuid': 'cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:36 user nova-compute[71972]: INFO os_vif [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:de:30,bridge_name='br-int',has_traffic_filtering=True,id=bc18e018-6929-44c7-be4a-53fc82ef85e1,network=Network(a91184a7-8ecf-419a-a191-2853dd054c4b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc18e018-69') Apr 17 22:12:36 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Instance destroyed successfully. Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.objects.instance [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lazy-loading 'resources' on Instance uuid d7c93039-48e6-40b8-b921-d6eb1ebe78ef {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:10:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-54695480',display_name='tempest-ServersNegativeTestJSON-server-54695480',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-54695480',id=12,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-17T22:10:53Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='f21699c3400842d3a28e71b288a4aaff',ramdisk_id='',reservation_id='r-fmibn9o9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersNegativeTestJSON-1844623378',owner_user_name='tempest-ServersNegativeTestJSON-1844623378-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:10:54Z,user_data=None,user_id='51c0b269c97241d9ad122b23af3ca7ea',uuid=d7c93039-48e6-40b8-b921-d6eb1ebe78ef,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b377f91d-95cf-42f6-8cb9-62aa8d68bcb1", "address": "fa:16:3e:6a:32:e4", "network": {"id": "3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb377f91d-95", "ovs_interfaceid": "b377f91d-95cf-42f6-8cb9-62aa8d68bcb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Converting VIF {"id": "b377f91d-95cf-42f6-8cb9-62aa8d68bcb1", 
"address": "fa:16:3e:6a:32:e4", "network": {"id": "3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapb377f91d-95", "ovs_interfaceid": "b377f91d-95cf-42f6-8cb9-62aa8d68bcb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:32:e4,bridge_name='br-int',has_traffic_filtering=True,id=b377f91d-95cf-42f6-8cb9-62aa8d68bcb1,network=Network(3bbe159c-29cd-4095-9556-8169500b1716),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb377f91d-95') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG os_vif [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:32:e4,bridge_name='br-int',has_traffic_filtering=True,id=b377f91d-95cf-42f6-8cb9-62aa8d68bcb1,network=Network(3bbe159c-29cd-4095-9556-8169500b1716),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb377f91d-95') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb377f91d-95, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:36 user nova-compute[71972]: INFO os_vif [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Successfully unplugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:6a:32:e4,bridge_name='br-int',has_traffic_filtering=True,id=b377f91d-95cf-42f6-8cb9-62aa8d68bcb1,network=Network(3bbe159c-29cd-4095-9556-8169500b1716),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb377f91d-95') Apr 17 22:12:36 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Deleting instance files /opt/stack/data/nova/instances/d7c93039-48e6-40b8-b921-d6eb1ebe78ef_del Apr 17 22:12:36 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Deletion of /opt/stack/data/nova/instances/d7c93039-48e6-40b8-b921-d6eb1ebe78ef_del complete Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] No BDM found with device name vda, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] No VIF found with MAC fa:16:3e:6d:de:30, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:12:36 user nova-compute[71972]: INFO nova.compute.manager [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 17 22:12:36 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.network.neutron [req-e2bb87be-a9db-48d0-99c7-9e731be5e591 req-5f9d284c-03cb-40d9-8cd7-b7ec27d9067c service nova] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Updated VIF entry in instance network info cache for port bc18e018-6929-44c7-be4a-53fc82ef85e1. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.network.neutron [req-e2bb87be-a9db-48d0-99c7-9e731be5e591 req-5f9d284c-03cb-40d9-8cd7-b7ec27d9067c service nova] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Updating instance_info_cache with network_info: [{"id": "bc18e018-6929-44c7-be4a-53fc82ef85e1", "address": "fa:16:3e:6d:de:30", "network": {"id": "a91184a7-8ecf-419a-a191-2853dd054c4b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1266987381-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a82db257b3494faca3f3759644a51b30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc18e018-69", "ovs_interfaceid": "bc18e018-6929-44c7-be4a-53fc82ef85e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-e2bb87be-a9db-48d0-99c7-9e731be5e591 req-5f9d284c-03cb-40d9-8cd7-b7ec27d9067c service nova] Releasing lock "refresh_cache-cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:12:36 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:12:36 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Took 0.47 seconds to deallocate network for instance. 
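The plug/unplug records above show os-vif driving the local Open vSwitch database through ovsdbapp transactions: an AddPortCommand plus a DbSetCommand that writes the Neutron port id, MAC and instance uuid into the Interface's external_ids on plug, and a DelPortCommand on unplug. Below is a minimal sketch of that same transaction pattern; the ovsdb endpoint, timeout and the helper names plug_port/unplug_port are illustrative assumptions, not Nova or os-vif code.

    # Sketch only: mirrors the AddPortCommand/DbSetCommand/DelPortCommand
    # transactions visible in the log records above. Endpoint and helper
    # names are assumptions.
    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    OVSDB = 'tcp:127.0.0.1:6640'  # assumed endpoint; os-vif uses its configured connection
    idl = connection.OvsdbIdl.from_server(OVSDB, 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=10))

    def plug_port(bridge, devname, iface_id, mac, vm_uuid):
        # One transaction: ensure the port exists, then tag the Interface row
        # with the same external_ids the DbSetCommand above writes.
        with api.transaction(check_error=True) as txn:
            txn.add(api.add_port(bridge, devname, may_exist=True))
            txn.add(api.db_set('Interface', devname,
                               ('external_ids', {'iface-id': iface_id,
                                                 'iface-status': 'active',
                                                 'attached-mac': mac,
                                                 'vm-uuid': vm_uuid})))

    def unplug_port(bridge, devname):
        # Mirrors the DelPortCommand issued on unplug.
        with api.transaction(check_error=True) as txn:
            txn.add(api.del_port(devname, bridge=bridge, if_exists=True))

    plug_port('br-int', 'tapbc18e018-69', 'bc18e018-6929-44c7-be4a-53fc82ef85e1',
              'fa:16:3e:6d:de:30', 'cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b')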
Apr 17 22:12:37 user nova-compute[71972]: DEBUG nova.compute.manager [req-9c4d0886-6017-4d5b-b98e-3b14b4c6900a req-9724ec38-e80f-43fd-aa45-617af88f6713 service nova] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Received event network-vif-deleted-b377f91d-95cf-42f6-8cb9-62aa8d68bcb1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:12:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:12:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:12:37 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:12:37 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:12:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.339s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:12:37 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Deleted allocations for instance d7c93039-48e6-40b8-b921-d6eb1ebe78ef Apr 17 22:12:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3891d777-81e2-44cf-b346-37ba3027c547 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "d7c93039-48e6-40b8-b921-d6eb1ebe78ef" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.711s {{(pid=71972) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:12:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:38 user nova-compute[71972]: DEBUG nova.compute.manager [req-1c0a6c94-66e6-4a42-adfc-8459f72898c1 req-b38d99f8-96e8-4c49-a080-c974dedb63bb service nova] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Received event network-vif-plugged-b377f91d-95cf-42f6-8cb9-62aa8d68bcb1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:12:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-1c0a6c94-66e6-4a42-adfc-8459f72898c1 req-b38d99f8-96e8-4c49-a080-c974dedb63bb service nova] Acquiring lock "d7c93039-48e6-40b8-b921-d6eb1ebe78ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:12:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-1c0a6c94-66e6-4a42-adfc-8459f72898c1 req-b38d99f8-96e8-4c49-a080-c974dedb63bb service nova] Lock "d7c93039-48e6-40b8-b921-d6eb1ebe78ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:12:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-1c0a6c94-66e6-4a42-adfc-8459f72898c1 req-b38d99f8-96e8-4c49-a080-c974dedb63bb service nova] Lock "d7c93039-48e6-40b8-b921-d6eb1ebe78ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:12:38 user nova-compute[71972]: DEBUG nova.compute.manager [req-1c0a6c94-66e6-4a42-adfc-8459f72898c1 req-b38d99f8-96e8-4c49-a080-c974dedb63bb service nova] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] No waiting events found dispatching network-vif-plugged-b377f91d-95cf-42f6-8cb9-62aa8d68bcb1 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:12:38 user nova-compute[71972]: WARNING nova.compute.manager [req-1c0a6c94-66e6-4a42-adfc-8459f72898c1 req-b38d99f8-96e8-4c49-a080-c974dedb63bb service nova] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Received unexpected event network-vif-plugged-b377f91d-95cf-42f6-8cb9-62aa8d68bcb1 for instance with vm_state deleted and task_state None. 
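The repeated "Acquiring lock ... / Lock ... acquired ... waited / ... released ... held" triplets in these records come from oslo.concurrency's lockutils, whose wrapper logs wait and hold times at DEBUG around the protected call. A minimal sketch of the pattern, with an illustrative lock name and function rather than Nova's actual code:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def update_usage(tracker, instance):
        # Runs with the in-process "compute_resources" lock held; the decorator's
        # inner wrapper emits the acquire/wait/hold DEBUG lines seen above.
        tracker.update(instance)

    # The same lock machinery can also be used explicitly as a context manager:
    with lockutils.lock('cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b-events'):
        pass  # critical section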
Apr 17 22:12:38 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:38 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json" returned: 0 in 0.173s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4e13341-82c2-4b86-8b5a-e12d435513ee/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4e13341-82c2-4b86-8b5a-e12d435513ee/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/e4e13341-82c2-4b86-8b5a-e12d435513ee/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4e13341-82c2-4b86-8b5a-e12d435513ee/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk.rescue --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG nova.compute.manager [req-5c65d4c4-e140-43f2-939c-c96afe19d89a req-670f21d4-7198-419c-88b1-93834dd6b06d service nova] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Received event network-vif-plugged-bc18e018-6929-44c7-be4a-53fc82ef85e1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-5c65d4c4-e140-43f2-939c-c96afe19d89a req-670f21d4-7198-419c-88b1-93834dd6b06d service nova] Acquiring lock "cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-5c65d4c4-e140-43f2-939c-c96afe19d89a req-670f21d4-7198-419c-88b1-93834dd6b06d service nova] Lock "cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-5c65d4c4-e140-43f2-939c-c96afe19d89a req-670f21d4-7198-419c-88b1-93834dd6b06d service nova] Lock "cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG nova.compute.manager [req-5c65d4c4-e140-43f2-939c-c96afe19d89a req-670f21d4-7198-419c-88b1-93834dd6b06d service nova] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] No waiting events found dispatching network-vif-plugged-bc18e018-6929-44c7-be4a-53fc82ef85e1 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:12:39 user nova-compute[71972]: WARNING nova.compute.manager [req-5c65d4c4-e140-43f2-939c-c96afe19d89a req-670f21d4-7198-419c-88b1-93834dd6b06d service nova] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Received unexpected event network-vif-plugged-bc18e018-6929-44c7-be4a-53fc82ef85e1 for instance with vm_state building and task_state spawning. 
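The qemu-img info invocations above run under "python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30", which is what oslo.concurrency's processutils.execute() produces when given a prlimit argument: the command is re-executed through a small wrapper that applies address-space and CPU-time resource limits before exec. A sketch of the equivalent call (the disk path is one of the paths from the log; the limit values match the --as/--cpu flags shown):

    from oslo_concurrency import processutils

    QEMU_IMG_LIMITS = processutils.ProcessLimits(
        cpu_time=30,                  # --cpu=30
        address_space=1 * 1024 ** 3)  # --as=1073741824

    out, err = processutils.execute(
        'qemu-img', 'info',
        '/opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b/disk',
        '--force-share', '--output=json',
        prlimit=QEMU_IMG_LIMITS,
        env_variables={'LC_ALL': 'C', 'LANG': 'C'})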
Apr 17 22:12:39 user nova-compute[71972]: DEBUG nova.compute.manager [req-5c65d4c4-e140-43f2-939c-c96afe19d89a req-670f21d4-7198-419c-88b1-93834dd6b06d service nova] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Received event network-vif-plugged-bc18e018-6929-44c7-be4a-53fc82ef85e1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-5c65d4c4-e140-43f2-939c-c96afe19d89a req-670f21d4-7198-419c-88b1-93834dd6b06d service nova] Acquiring lock "cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-5c65d4c4-e140-43f2-939c-c96afe19d89a req-670f21d4-7198-419c-88b1-93834dd6b06d service nova] Lock "cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-5c65d4c4-e140-43f2-939c-c96afe19d89a req-670f21d4-7198-419c-88b1-93834dd6b06d service nova] Lock "cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG nova.compute.manager [req-5c65d4c4-e140-43f2-939c-c96afe19d89a req-670f21d4-7198-419c-88b1-93834dd6b06d service nova] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] No waiting events found dispatching network-vif-plugged-bc18e018-6929-44c7-be4a-53fc82ef85e1 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:12:39 user nova-compute[71972]: WARNING nova.compute.manager [req-5c65d4c4-e140-43f2-939c-c96afe19d89a req-670f21d4-7198-419c-88b1-93834dd6b06d service nova] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Received unexpected event network-vif-plugged-bc18e018-6929-44c7-be4a-53fc82ef85e1 for instance with vm_state building and task_state spawning. 
Apr 17 22:12:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk.rescue --force-share --output=json" returned: 0 in 0.138s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk.rescue --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk.rescue --force-share --output=json" returned: 0 in 0.144s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json" returned: 0 in 0.159s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2b53a15a-2e55-4c9e-976b-addb176545fa/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2b53a15a-2e55-4c9e-976b-addb176545fa/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2b53a15a-2e55-4c9e-976b-addb176545fa/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2b53a15a-2e55-4c9e-976b-addb176545fa/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:12:39 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] VM Resumed (Lifecycle Event) Apr 17 22:12:39 user nova-compute[71972]: DEBUG nova.compute.manager [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Instance event wait completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:12:39 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Instance spawned successfully. 
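The "Emitting event ... Resumed/Started" and "VM Resumed (Lifecycle Event)" records come from the libvirt driver's domain lifecycle callback, which turns libvirt domain events into Nova lifecycle events that the compute manager then reconciles against the database. A standalone sketch of registering such a callback with libvirt-python; the connection URI and the bare run loop are illustrative, not how Nova wires it up:

    import libvirt

    def lifecycle_cb(conn, dom, event, detail, _opaque):
        # event is e.g. libvirt.VIR_DOMAIN_EVENT_STARTED or _RESUMED
        print(dom.UUIDString(), event, detail)

    libvirt.virEventRegisterDefaultImpl()
    conn = libvirt.open('qemu:///system')  # assumed URI
    conn.domainEventRegisterAny(
        None, libvirt.VIR_DOMAIN_EVENT_ID_LIFECYCLE, lifecycle_cb, None)
    while True:
        libvirt.virEventRunDefaultImpl()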
Apr 17 22:12:39 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:12:39 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:12:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Found default for hw_vif_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:12:40 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: 
building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:12:40 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:12:40 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:12:40 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] VM Started (Lifecycle Event) Apr 17 22:12:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b/disk --force-share --output=json" returned: 0 in 0.157s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:40 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:12:40 user nova-compute[71972]: INFO nova.compute.manager [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Took 5.96 seconds to spawn the instance on the hypervisor. Apr 17 22:12:40 user nova-compute[71972]: DEBUG nova.compute.manager [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:12:40 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:12:40 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] During sync_power_state the instance has a pending task (spawning). Skip. 
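The "Checking state" and sync_power_state records compare the power state recorded in the database (0, i.e. NOSTATE for an instance still building) with what the hypervisor reports (1, RUNNING). A minimal sketch of reading the domain state this mapping is based on, using libvirt-python directly; the URI is an assumption and the uuid is the instance from the log:

    import libvirt

    conn = libvirt.open('qemu:///system')  # assumed URI
    dom = conn.lookupByUUIDString('cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b')
    state, _reason = dom.state()
    # libvirt.VIR_DOMAIN_RUNNING == 1, which Nova records as power_state RUNNING
    print(state == libvirt.VIR_DOMAIN_RUNNING)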
Apr 17 22:12:40 user nova-compute[71972]: INFO nova.compute.manager [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Took 6.70 seconds to build instance. Apr 17 22:12:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/128ffde0-5149-48d2-a56e-c41418fbc753/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-af667c16-0922-4b67-845b-494951d68ea0 tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.818s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:12:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/128ffde0-5149-48d2-a56e-c41418fbc753/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/128ffde0-5149-48d2-a56e-c41418fbc753/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/128ffde0-5149-48d2-a56e-c41418fbc753/disk --force-share --output=json" returned: 0 in 0.273s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json {{(pid=71972) 
execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json" returned: 0 in 0.157s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] 
Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:12:41 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:12:42 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:12:42 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
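Editor's note: the repeated qemu-img invocations above appear to come from the periodic resource audit (the same req-f866c7f7 request later reports the hypervisor resource view); each call is wrapped in oslo_concurrency.prlimit with a 1 GiB address-space cap and a 30 s CPU cap. The helper below simply replays the command exactly as logged, via subprocess, for manual troubleshooting on a host where qemu-img and oslo.concurrency are installed; it is a sketch, not Nova's internal code path.

```python
import json
import subprocess

def qemu_img_info(disk_path: str) -> dict:
    """Run qemu-img info the way the log shows it being run:
    under oslo_concurrency.prlimit with --as=1073741824 --cpu=30."""
    cmd = [
        "python3", "-m", "oslo_concurrency.prlimit",   # log uses /usr/bin/python3.10
        "--as=1073741824", "--cpu=30", "--",
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "info", disk_path, "--force-share", "--output=json",
    ]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return json.loads(out.stdout)

# Example, with a path taken from the log:
# info = qemu_img_info(
#     "/opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b/disk")
# print(info["format"], info["virtual-size"])
```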
Apr 17 22:12:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=8006MB free_disk=26.498924255371094GB free_vcpus=3 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:12:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:12:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:12:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 7bb0580b-727f-4168-9d56-56dcb4fa404e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:12:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 99cb7131-abb8-41d6-bddd-c3bc943b7678 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:12:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 9634492c-168d-4b49-941a-b89703571b73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:12:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 97067629-e099-49fd-bb79-223dd4401405 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:12:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance d9386728-7c3f-42ff-8f1c-51748ccefff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:12:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 128ffde0-5149-48d2-a56e-c41418fbc753 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:12:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance e4e13341-82c2-4b86-8b5a-e12d435513ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:12:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 2b53a15a-2e55-4c9e-976b-addb176545fa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:12:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:12:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 9 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:12:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=1664MB phys_disk=40GB used_disk=9GB total_vcpus=12 used_vcpus=9 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:12:42 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:12:42 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:12:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:12:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.509s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:12:42 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:44 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:12:44 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:12:44 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:12:44 user nova-compute[71972]: DEBUG 
nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:12:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "refresh_cache-9634492c-168d-4b49-941a-b89703571b73" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:12:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquired lock "refresh_cache-9634492c-168d-4b49-941a-b89703571b73" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:12:44 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 9634492c-168d-4b49-941a-b89703571b73] Forcefully refreshing network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 22:12:45 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 9634492c-168d-4b49-941a-b89703571b73] Updating instance_info_cache with network_info: [{"id": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "address": "fa:16:3e:ab:f9:49", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd22f6b6c-44", "ovs_interfaceid": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:12:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Releasing lock "refresh_cache-9634492c-168d-4b49-941a-b89703571b73" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:12:45 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 9634492c-168d-4b49-941a-b89703571b73] Updated the network info_cache for instance {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 22:12:45 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:12:45 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) 
run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:12:45 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:12:45 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:12:45 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:12:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:50 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:51 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:12:51 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] VM Stopped (Lifecycle Event) Apr 17 22:12:51 user nova-compute[71972]: DEBUG nova.compute.manager [None req-81c65c98-ba73-438f-88d5-47683b9e7505 None None] [instance: d7c93039-48e6-40b8-b921-d6eb1ebe78ef] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:12:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:12:57 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:07 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:12 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:17 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:21 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:26 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:27 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:34 user nova-compute[71972]: DEBUG nova.compute.manager [req-d3ad55ef-623a-4112-b338-12fc60788c8e req-99e3554b-773f-487d-bdd7-8b6bdb7dfbe0 service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Received event network-changed-7057b0c3-d9e3-4814-9d2d-2b70e922533b {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:13:34 user nova-compute[71972]: DEBUG nova.compute.manager [req-d3ad55ef-623a-4112-b338-12fc60788c8e req-99e3554b-773f-487d-bdd7-8b6bdb7dfbe0 service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Refreshing instance network info cache due to event network-changed-7057b0c3-d9e3-4814-9d2d-2b70e922533b. 
{{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:13:34 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-d3ad55ef-623a-4112-b338-12fc60788c8e req-99e3554b-773f-487d-bdd7-8b6bdb7dfbe0 service nova] Acquiring lock "refresh_cache-128ffde0-5149-48d2-a56e-c41418fbc753" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:13:34 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-d3ad55ef-623a-4112-b338-12fc60788c8e req-99e3554b-773f-487d-bdd7-8b6bdb7dfbe0 service nova] Acquired lock "refresh_cache-128ffde0-5149-48d2-a56e-c41418fbc753" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:13:34 user nova-compute[71972]: DEBUG nova.network.neutron [req-d3ad55ef-623a-4112-b338-12fc60788c8e req-99e3554b-773f-487d-bdd7-8b6bdb7dfbe0 service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Refreshing network info cache for port 7057b0c3-d9e3-4814-9d2d-2b70e922533b {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:13:34 user nova-compute[71972]: DEBUG nova.network.neutron [req-d3ad55ef-623a-4112-b338-12fc60788c8e req-99e3554b-773f-487d-bdd7-8b6bdb7dfbe0 service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Updated VIF entry in instance network info cache for port 7057b0c3-d9e3-4814-9d2d-2b70e922533b. {{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:13:34 user nova-compute[71972]: DEBUG nova.network.neutron [req-d3ad55ef-623a-4112-b338-12fc60788c8e req-99e3554b-773f-487d-bdd7-8b6bdb7dfbe0 service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Updating instance_info_cache with network_info: [{"id": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "address": "fa:16:3e:7c:ba:7a", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7057b0c3-d9", "ovs_interfaceid": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:13:34 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-d3ad55ef-623a-4112-b338-12fc60788c8e req-99e3554b-773f-487d-bdd7-8b6bdb7dfbe0 service nova] Releasing lock "refresh_cache-128ffde0-5149-48d2-a56e-c41418fbc753" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:13:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock 
"128ffde0-5149-48d2-a56e-c41418fbc753" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "128ffde0-5149-48d2-a56e-c41418fbc753" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "128ffde0-5149-48d2-a56e-c41418fbc753-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "128ffde0-5149-48d2-a56e-c41418fbc753-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "128ffde0-5149-48d2-a56e-c41418fbc753-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:35 user nova-compute[71972]: INFO nova.compute.manager [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Terminating instance Apr 17 22:13:35 user nova-compute[71972]: DEBUG nova.compute.manager [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Start destroying the instance on the hypervisor. 
{{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:13:35 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:35 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:35 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:35 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:35 user nova-compute[71972]: DEBUG nova.compute.manager [req-5a049726-260e-47be-9cc3-ef51d211badd req-51b62574-e614-4e54-ae66-daaed157dbd4 service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Received event network-vif-unplugged-7057b0c3-d9e3-4814-9d2d-2b70e922533b {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:13:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-5a049726-260e-47be-9cc3-ef51d211badd req-51b62574-e614-4e54-ae66-daaed157dbd4 service nova] Acquiring lock "128ffde0-5149-48d2-a56e-c41418fbc753-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-5a049726-260e-47be-9cc3-ef51d211badd req-51b62574-e614-4e54-ae66-daaed157dbd4 service nova] Lock "128ffde0-5149-48d2-a56e-c41418fbc753-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-5a049726-260e-47be-9cc3-ef51d211badd req-51b62574-e614-4e54-ae66-daaed157dbd4 service nova] Lock "128ffde0-5149-48d2-a56e-c41418fbc753-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:35 user nova-compute[71972]: DEBUG nova.compute.manager [req-5a049726-260e-47be-9cc3-ef51d211badd req-51b62574-e614-4e54-ae66-daaed157dbd4 service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] No waiting events found dispatching network-vif-unplugged-7057b0c3-d9e3-4814-9d2d-2b70e922533b {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:13:35 user nova-compute[71972]: DEBUG nova.compute.manager [req-5a049726-260e-47be-9cc3-ef51d211badd req-51b62574-e614-4e54-ae66-daaed157dbd4 service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Received event network-vif-unplugged-7057b0c3-d9e3-4814-9d2d-2b70e922533b for instance with task_state deleting. {{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:13:36 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Instance destroyed successfully. 
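Editor's note: the 22:12:42 resource-tracker entries further up report an inventory with allocation_ratio 4.0 for VCPU and a 512 MB memory reservation, alongside "Total usable vcpus: 12, total allocated vcpus: 9". Placement turns such an inventory into schedulable capacity as (total - reserved) * allocation_ratio; the snippet below reproduces that arithmetic with the exact numbers from the log (the dict literal is just a transcription of the logged inventory, not an API call).

```python
# Capacity placement derives from the inventory logged above:
#   capacity = (total - reserved) * allocation_ratio
inventory = {
    "VCPU":      {"total": 12,    "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 16023, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 40,    "reserved": 0,   "allocation_ratio": 1.0},
}

for rc, inv in inventory.items():
    capacity = int((inv["total"] - inv["reserved"]) * inv["allocation_ratio"])
    print(f"{rc}: schedulable capacity = {capacity}")
# VCPU: 48, MEMORY_MB: 15511, DISK_GB: 40 -- which is why 9 allocated of
# 12 physical vCPUs still leaves headroom for more instances.
```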
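Editor's note: the 22:13:35 entries above show the terminate path taking two named locks before touching the hypervisor: one on the instance UUID around do_terminate_instance, and one on "<uuid>-events" while queued external events are cleared. A minimal sketch of the same serialization pattern with oslo.concurrency follows; destroy_instance is a hypothetical stand-in for the real teardown work.

```python
from oslo_concurrency import lockutils

def destroy_instance(instance_uuid: str) -> None:
    # Hypothetical stand-in for the hypervisor teardown seen in the log.
    print(f"destroying {instance_uuid}")

def terminate_instance(instance_uuid: str) -> None:
    """Serialize teardown per instance with the two named locks the log shows:
    "<uuid>" around the whole operation, "<uuid>-events" while clearing events."""
    with lockutils.lock(instance_uuid):
        with lockutils.lock(f"{instance_uuid}-events"):
            pass  # queued external events for this instance would be cleared here
        destroy_instance(instance_uuid)

terminate_instance("128ffde0-5149-48d2-a56e-c41418fbc753")
```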
Apr 17 22:13:36 user nova-compute[71972]: DEBUG nova.objects.instance [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lazy-loading 'resources' on Instance uuid 128ffde0-5149-48d2-a56e-c41418fbc753 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:13:36 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:11:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-494563557',display_name='tempest-AttachVolumeNegativeTest-server-494563557',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-494563557',id=13,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCCM4zdhbyTNqZazXlr1WLqme2C0HFJDjVxfYGQZTlRtIWULGYITIT3EQ1q/6k7xhD3mm3oc0QfKdmb3gJ1SjZzP4d00vGPpTXvENVjyWcQGVsW5qXE+WwAnTojbreOxhA==',key_name='tempest-keypair-682999879',keypairs=,launch_index=0,launched_at=2023-04-17T22:11:49Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='e52724ed9bc54905bd5eddd8504e4c77',ramdisk_id='',reservation_id='r-dzjheaur',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-1678965362',owner_user_name='tempest-AttachVolumeNegativeTest-1678965362-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:11:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='52b3e35c03b54ae4b5dabfb1325886a9',uuid=128ffde0-5149-48d2-a56e-c41418fbc753,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "address": "fa:16:3e:7c:ba:7a", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7057b0c3-d9", "ovs_interfaceid": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:13:36 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Converting VIF {"id": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "address": "fa:16:3e:7c:ba:7a", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7057b0c3-d9", "ovs_interfaceid": "7057b0c3-d9e3-4814-9d2d-2b70e922533b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:13:36 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7c:ba:7a,bridge_name='br-int',has_traffic_filtering=True,id=7057b0c3-d9e3-4814-9d2d-2b70e922533b,network=Network(bd9d53b7-0e22-42c4-8b91-49575cadf44f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7057b0c3-d9') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:13:36 user nova-compute[71972]: DEBUG os_vif [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:ba:7a,bridge_name='br-int',has_traffic_filtering=True,id=7057b0c3-d9e3-4814-9d2d-2b70e922533b,network=Network(bd9d53b7-0e22-42c4-8b91-49575cadf44f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7057b0c3-d9') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:13:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7057b0c3-d9, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:13:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:13:36 user nova-compute[71972]: INFO os_vif [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:ba:7a,bridge_name='br-int',has_traffic_filtering=True,id=7057b0c3-d9e3-4814-9d2d-2b70e922533b,network=Network(bd9d53b7-0e22-42c4-8b91-49575cadf44f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7057b0c3-d9') Apr 17 22:13:36 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Deleting instance files /opt/stack/data/nova/instances/128ffde0-5149-48d2-a56e-c41418fbc753_del Apr 17 22:13:36 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Deletion of /opt/stack/data/nova/instances/128ffde0-5149-48d2-a56e-c41418fbc753_del complete Apr 17 22:13:36 user nova-compute[71972]: INFO nova.compute.manager [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Took 0.88 seconds to destroy the instance on the hypervisor. Apr 17 22:13:36 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:13:36 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:13:36 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:13:36 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:13:36 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:13:36 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:13:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:36 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:13:36 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:13:36 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json" returned: 0 in 0.172s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:13:36 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG nova.compute.manager [req-991b9899-ace3-4dd7-a01d-c6bcf29d2494 req-5ac493a7-ec1b-4f04-bcec-88a6dee6fc7a service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Received event network-vif-deleted-7057b0c3-d9e3-4814-9d2d-2b70e922533b {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:13:37 user nova-compute[71972]: INFO nova.compute.manager [req-991b9899-ace3-4dd7-a01d-c6bcf29d2494 req-5ac493a7-ec1b-4f04-bcec-88a6dee6fc7a service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Neutron deleted interface 7057b0c3-d9e3-4814-9d2d-2b70e922533b; detaching it from the instance and deleting it from the info cache Apr 17 22:13:37 user nova-compute[71972]: DEBUG nova.network.neutron [req-991b9899-ace3-4dd7-a01d-c6bcf29d2494 req-5ac493a7-ec1b-4f04-bcec-88a6dee6fc7a service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:13:37 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Took 0.68 seconds to deallocate network for instance. Apr 17 22:13:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG nova.compute.manager [req-991b9899-ace3-4dd7-a01d-c6bcf29d2494 req-5ac493a7-ec1b-4f04-bcec-88a6dee6fc7a service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Detach interface failed, port_id=7057b0c3-d9e3-4814-9d2d-2b70e922533b, reason: Instance 128ffde0-5149-48d2-a56e-c41418fbc753 could not be found. 
{{(pid=71972) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4e13341-82c2-4b86-8b5a-e12d435513ee/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4e13341-82c2-4b86-8b5a-e12d435513ee/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4e13341-82c2-4b86-8b5a-e12d435513ee/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/e4e13341-82c2-4b86-8b5a-e12d435513ee/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk.rescue --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 
tempest-AttachVolumeNegativeTest-1678965362-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk.rescue --force-share --output=json" returned: 0 in 0.146s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk.rescue --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.424s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:37 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Deleted allocations for instance 128ffde0-5149-48d2-a56e-c41418fbc753 Apr 17 22:13:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-73ec08b2-562f-4802-833c-38319ba5fec2 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "128ffde0-5149-48d2-a56e-c41418fbc753" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.192s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m 
oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk.rescue --force-share --output=json" returned: 0 in 0.168s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json" returned: 0 in 0.148s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG nova.compute.manager [req-78a7fdd2-6624-4c43-8de7-1c5084b19b46 req-634f4a16-4a65-4282-9445-fd6dfa3c323f service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Received event network-vif-plugged-7057b0c3-d9e3-4814-9d2d-2b70e922533b {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-78a7fdd2-6624-4c43-8de7-1c5084b19b46 req-634f4a16-4a65-4282-9445-fd6dfa3c323f service nova] Acquiring lock "128ffde0-5149-48d2-a56e-c41418fbc753-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-78a7fdd2-6624-4c43-8de7-1c5084b19b46 req-634f4a16-4a65-4282-9445-fd6dfa3c323f service nova] Lock "128ffde0-5149-48d2-a56e-c41418fbc753-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-78a7fdd2-6624-4c43-8de7-1c5084b19b46 req-634f4a16-4a65-4282-9445-fd6dfa3c323f service nova] Lock "128ffde0-5149-48d2-a56e-c41418fbc753-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:37 user nova-compute[71972]: DEBUG nova.compute.manager [req-78a7fdd2-6624-4c43-8de7-1c5084b19b46 req-634f4a16-4a65-4282-9445-fd6dfa3c323f service nova] [instance: 
128ffde0-5149-48d2-a56e-c41418fbc753] No waiting events found dispatching network-vif-plugged-7057b0c3-d9e3-4814-9d2d-2b70e922533b {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:13:37 user nova-compute[71972]: WARNING nova.compute.manager [req-78a7fdd2-6624-4c43-8de7-1c5084b19b46 req-634f4a16-4a65-4282-9445-fd6dfa3c323f service nova] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Received unexpected event network-vif-plugged-7057b0c3-d9e3-4814-9d2d-2b70e922533b for instance with vm_state deleted and task_state None. Apr 17 22:13:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:13:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2b53a15a-2e55-4c9e-976b-addb176545fa/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:13:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2b53a15a-2e55-4c9e-976b-addb176545fa/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:13:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2b53a15a-2e55-4c9e-976b-addb176545fa/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:13:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/2b53a15a-2e55-4c9e-976b-addb176545fa/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:13:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:13:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] 
CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:13:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:13:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:13:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:13:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json" returned: 0 in 0.127s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:13:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:13:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json" returned: 0 in 0.131s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:13:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk --force-share 
--output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:13:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:13:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:13:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:13:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:13:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json" returned: 0 in 0.146s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:13:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:13:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Acquiring lock "e4e13341-82c2-4b86-8b5a-e12d435513ee" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a 
tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "e4e13341-82c2-4b86-8b5a-e12d435513ee" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Acquiring lock "e4e13341-82c2-4b86-8b5a-e12d435513ee-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "e4e13341-82c2-4b86-8b5a-e12d435513ee-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "e4e13341-82c2-4b86-8b5a-e12d435513ee-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:39 user nova-compute[71972]: INFO nova.compute.manager [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Terminating instance Apr 17 22:13:39 user nova-compute[71972]: DEBUG nova.compute.manager [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Start destroying the instance on the hypervisor. 
{{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:13:39 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.compute.manager [req-2dfd895e-2c17-437f-ba65-1437f9aec9b6 req-2ece0586-33cd-47b8-9ec1-e81d364cc9f9 service nova] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Received event network-vif-unplugged-ea831325-55ac-45ca-ab26-4b424e66ca77 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-2dfd895e-2c17-437f-ba65-1437f9aec9b6 req-2ece0586-33cd-47b8-9ec1-e81d364cc9f9 service nova] Acquiring lock "e4e13341-82c2-4b86-8b5a-e12d435513ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-2dfd895e-2c17-437f-ba65-1437f9aec9b6 req-2ece0586-33cd-47b8-9ec1-e81d364cc9f9 service nova] Lock "e4e13341-82c2-4b86-8b5a-e12d435513ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-2dfd895e-2c17-437f-ba65-1437f9aec9b6 req-2ece0586-33cd-47b8-9ec1-e81d364cc9f9 service nova] Lock "e4e13341-82c2-4b86-8b5a-e12d435513ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.compute.manager [req-2dfd895e-2c17-437f-ba65-1437f9aec9b6 req-2ece0586-33cd-47b8-9ec1-e81d364cc9f9 service nova] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] No waiting events found dispatching network-vif-unplugged-ea831325-55ac-45ca-ab26-4b424e66ca77 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.compute.manager [req-2dfd895e-2c17-437f-ba65-1437f9aec9b6 req-2ece0586-33cd-47b8-9ec1-e81d364cc9f9 service nova] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Received event network-vif-unplugged-ea831325-55ac-45ca-ab26-4b424e66ca77 for instance with task_state deleting. {{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:13:40 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Instance destroyed successfully. 
Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.objects.instance [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lazy-loading 'resources' on Instance uuid e4e13341-82c2-4b86-8b5a-e12d435513ee {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:13:40 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:13:40 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=8238MB free_disk=26.487071990966797GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:11:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1560519322',display_name='tempest-VolumesAdminNegativeTest-server-1560519322',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1560519322',id=14,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-17T22:11:54Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='416fcd7cd2bc486884f751acab268fd8',ramdisk_id='',reservation_id='r-eu7szm24',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesAdminNegativeTest-315022411',owner_user_name='tempest-VolumesAdminNegativeTest-315022411-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:11:55Z,user_data=None,user_id='70af0dc4dbf24ae1add76f3c87f8b1b5',uuid=e4e13341-82c2-4b86-8b5a-e12d435513ee,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ea831325-55ac-45ca-ab26-4b424e66ca77", "address": "fa:16:3e:c5:43:1e", "network": {"id": "83a63ee2-d2cb-414d-bab9-c556ee1c2c88", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-625420379-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "416fcd7cd2bc486884f751acab268fd8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapea831325-55", "ovs_interfaceid": "ea831325-55ac-45ca-ab26-4b424e66ca77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Converting VIF {"id": 
"ea831325-55ac-45ca-ab26-4b424e66ca77", "address": "fa:16:3e:c5:43:1e", "network": {"id": "83a63ee2-d2cb-414d-bab9-c556ee1c2c88", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-625420379-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "416fcd7cd2bc486884f751acab268fd8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapea831325-55", "ovs_interfaceid": "ea831325-55ac-45ca-ab26-4b424e66ca77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:43:1e,bridge_name='br-int',has_traffic_filtering=True,id=ea831325-55ac-45ca-ab26-4b424e66ca77,network=Network(83a63ee2-d2cb-414d-bab9-c556ee1c2c88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea831325-55') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG os_vif [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:43:1e,bridge_name='br-int',has_traffic_filtering=True,id=ea831325-55ac-45ca-ab26-4b424e66ca77,network=Network(83a63ee2-d2cb-414d-bab9-c556ee1c2c88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea831325-55') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea831325-55, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:13:40 user nova-compute[71972]: INFO os_vif [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Successfully unplugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:c5:43:1e,bridge_name='br-int',has_traffic_filtering=True,id=ea831325-55ac-45ca-ab26-4b424e66ca77,network=Network(83a63ee2-d2cb-414d-bab9-c556ee1c2c88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea831325-55') Apr 17 22:13:40 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Deleting instance files /opt/stack/data/nova/instances/e4e13341-82c2-4b86-8b5a-e12d435513ee_del Apr 17 22:13:40 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Deletion of /opt/stack/data/nova/instances/e4e13341-82c2-4b86-8b5a-e12d435513ee_del complete Apr 17 22:13:40 user nova-compute[71972]: INFO nova.compute.manager [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Took 1.05 seconds to destroy the instance on the hypervisor. Apr 17 22:13:40 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 7bb0580b-727f-4168-9d56-56dcb4fa404e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 99cb7131-abb8-41d6-bddd-c3bc943b7678 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 9634492c-168d-4b49-941a-b89703571b73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 97067629-e099-49fd-bb79-223dd4401405 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance d9386728-7c3f-42ff-8f1c-51748ccefff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance e4e13341-82c2-4b86-8b5a-e12d435513ee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 2b53a15a-2e55-4c9e-976b-addb176545fa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 8 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=1536MB phys_disk=40GB used_disk=8GB total_vcpus=12 used_vcpus=8 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:13:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.416s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:41 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:13:41 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Took 0.83 seconds to deallocate network for instance. 
Apr 17 22:13:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:41 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:13:41 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:13:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.257s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:41 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Deleted allocations for instance e4e13341-82c2-4b86-8b5a-e12d435513ee Apr 17 22:13:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7511374a-ab6f-4e67-9bb7-ae05dff2452a tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "e4e13341-82c2-4b86-8b5a-e12d435513ee" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.409s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:41 user nova-compute[71972]: DEBUG nova.compute.manager [req-855f0f5c-16bc-4f38-8602-1c649fa665ce req-46e05d9b-11cf-4c0e-9537-3ae99adf0e2e service nova] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Received event network-vif-plugged-ea831325-55ac-45ca-ab26-4b424e66ca77 {{(pid=71972) external_instance_event 
/opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:13:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-855f0f5c-16bc-4f38-8602-1c649fa665ce req-46e05d9b-11cf-4c0e-9537-3ae99adf0e2e service nova] Acquiring lock "e4e13341-82c2-4b86-8b5a-e12d435513ee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-855f0f5c-16bc-4f38-8602-1c649fa665ce req-46e05d9b-11cf-4c0e-9537-3ae99adf0e2e service nova] Lock "e4e13341-82c2-4b86-8b5a-e12d435513ee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-855f0f5c-16bc-4f38-8602-1c649fa665ce req-46e05d9b-11cf-4c0e-9537-3ae99adf0e2e service nova] Lock "e4e13341-82c2-4b86-8b5a-e12d435513ee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:41 user nova-compute[71972]: DEBUG nova.compute.manager [req-855f0f5c-16bc-4f38-8602-1c649fa665ce req-46e05d9b-11cf-4c0e-9537-3ae99adf0e2e service nova] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] No waiting events found dispatching network-vif-plugged-ea831325-55ac-45ca-ab26-4b424e66ca77 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:13:41 user nova-compute[71972]: WARNING nova.compute.manager [req-855f0f5c-16bc-4f38-8602-1c649fa665ce req-46e05d9b-11cf-4c0e-9537-3ae99adf0e2e service nova] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Received unexpected event network-vif-plugged-ea831325-55ac-45ca-ab26-4b424e66ca77 for instance with vm_state deleted and task_state None. Apr 17 22:13:41 user nova-compute[71972]: DEBUG nova.compute.manager [req-855f0f5c-16bc-4f38-8602-1c649fa665ce req-46e05d9b-11cf-4c0e-9537-3ae99adf0e2e service nova] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Received event network-vif-deleted-ea831325-55ac-45ca-ab26-4b424e66ca77 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:13:42 user nova-compute[71972]: DEBUG nova.compute.manager [req-cf4e6d6c-5a67-4d95-b2c2-46642586ce5d req-5e5df66c-82b0-49b1-bec0-8493b40e5db4 service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Received event network-changed-2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:13:42 user nova-compute[71972]: DEBUG nova.compute.manager [req-cf4e6d6c-5a67-4d95-b2c2-46642586ce5d req-5e5df66c-82b0-49b1-bec0-8493b40e5db4 service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Refreshing instance network info cache due to event network-changed-2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7. 
{{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:13:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-cf4e6d6c-5a67-4d95-b2c2-46642586ce5d req-5e5df66c-82b0-49b1-bec0-8493b40e5db4 service nova] Acquiring lock "refresh_cache-2b53a15a-2e55-4c9e-976b-addb176545fa" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:13:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-cf4e6d6c-5a67-4d95-b2c2-46642586ce5d req-5e5df66c-82b0-49b1-bec0-8493b40e5db4 service nova] Acquired lock "refresh_cache-2b53a15a-2e55-4c9e-976b-addb176545fa" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:13:42 user nova-compute[71972]: DEBUG nova.network.neutron [req-cf4e6d6c-5a67-4d95-b2c2-46642586ce5d req-5e5df66c-82b0-49b1-bec0-8493b40e5db4 service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Refreshing network info cache for port 2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:13:42 user nova-compute[71972]: DEBUG nova.network.neutron [req-cf4e6d6c-5a67-4d95-b2c2-46642586ce5d req-5e5df66c-82b0-49b1-bec0-8493b40e5db4 service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Updated VIF entry in instance network info cache for port 2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7. {{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:13:42 user nova-compute[71972]: DEBUG nova.network.neutron [req-cf4e6d6c-5a67-4d95-b2c2-46642586ce5d req-5e5df66c-82b0-49b1-bec0-8493b40e5db4 service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Updating instance_info_cache with network_info: [{"id": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "address": "fa:16:3e:0a:f9:e7", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2515c5ff-fa", "ovs_interfaceid": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:13:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-cf4e6d6c-5a67-4d95-b2c2-46642586ce5d req-5e5df66c-82b0-49b1-bec0-8493b40e5db4 service nova] Releasing lock "refresh_cache-2b53a15a-2e55-4c9e-976b-addb176545fa" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:13:42 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:42 user nova-compute[71972]: DEBUG 
oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:13:42 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:13:42 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:13:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "refresh_cache-97067629-e099-49fd-bb79-223dd4401405" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:13:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquired lock "refresh_cache-97067629-e099-49fd-bb79-223dd4401405" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:13:42 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 97067629-e099-49fd-bb79-223dd4401405] Forcefully refreshing network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 97067629-e099-49fd-bb79-223dd4401405] Updating instance_info_cache with network_info: [{"id": "63dc9a41-e89e-4673-a658-7acddd88706f", "address": "fa:16:3e:3e:d4:f0", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap63dc9a41-e8", "ovs_interfaceid": "63dc9a41-e89e-4673-a658-7acddd88706f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Releasing lock "refresh_cache-97067629-e099-49fd-bb79-223dd4401405" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 97067629-e099-49fd-bb79-223dd4401405] Updated the network info_cache for instance 
{{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Acquiring lock "2b53a15a-2e55-4c9e-976b-addb176545fa" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "2b53a15a-2e55-4c9e-976b-addb176545fa" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Acquiring lock "2b53a15a-2e55-4c9e-976b-addb176545fa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "2b53a15a-2e55-4c9e-976b-addb176545fa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None 
req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "2b53a15a-2e55-4c9e-976b-addb176545fa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:43 user nova-compute[71972]: INFO nova.compute.manager [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Terminating instance Apr 17 22:13:43 user nova-compute[71972]: DEBUG nova.compute.manager [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Start destroying the instance on the hypervisor. {{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG nova.compute.manager [req-88b6b1f3-d296-477b-8688-c05925a09c4f req-77832864-d072-4ad0-883e-527531e008ec service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Received event network-vif-unplugged-2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-88b6b1f3-d296-477b-8688-c05925a09c4f req-77832864-d072-4ad0-883e-527531e008ec service nova] Acquiring lock "2b53a15a-2e55-4c9e-976b-addb176545fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-88b6b1f3-d296-477b-8688-c05925a09c4f req-77832864-d072-4ad0-883e-527531e008ec service nova] Lock "2b53a15a-2e55-4c9e-976b-addb176545fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-88b6b1f3-d296-477b-8688-c05925a09c4f req-77832864-d072-4ad0-883e-527531e008ec service nova] Lock "2b53a15a-2e55-4c9e-976b-addb176545fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG nova.compute.manager [req-88b6b1f3-d296-477b-8688-c05925a09c4f req-77832864-d072-4ad0-883e-527531e008ec service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] No waiting events found dispatching 
network-vif-unplugged-2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:13:43 user nova-compute[71972]: DEBUG nova.compute.manager [req-88b6b1f3-d296-477b-8688-c05925a09c4f req-77832864-d072-4ad0-883e-527531e008ec service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Received event network-vif-unplugged-2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7 for instance with task_state deleting. {{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:13:44 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Instance destroyed successfully. Apr 17 22:13:44 user nova-compute[71972]: DEBUG nova.objects.instance [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lazy-loading 'resources' on Instance uuid 2b53a15a-2e55-4c9e-976b-addb176545fa {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:13:44 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:11:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-742045578',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumeshelvetestjson-server-742045578',id=15,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPQR3wOyHdbetUVTI8L6ivA4S33oxQgPScnR6ThbG577rtQzESbVwvoSv8WcOg2mDZIkUxurn2f81Gs5LsSgI5VUppArPBavq2+Gv6ZDUuSwGAQdnLviswAseye+/hTX7A==',key_name='tempest-keypair-41502610',keypairs=,launch_index=0,launched_at=2023-04-17T22:11:57Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='9283fe3c9a094f9bbddb08e48973da44',ramdisk_id='',reservation_id='r-a0ylbpui',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeShelveTestJSON-1051644628',owner_user_name='tempest-AttachVolumeShelveTestJSON-1051644628-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:11:57Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a5ec05fe7d7244368c7eec3739a96c19',uuid=2b53a15a-2e55-4c9e-976b-addb176545fa,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "address": "fa:16:3e:0a:f9:e7", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2515c5ff-fa", "ovs_interfaceid": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:13:44 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Converting VIF {"id": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "address": "fa:16:3e:0a:f9:e7", "network": {"id": "2aae2552-3ef8-41d3-84e6-313da6fc203b", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1626967335-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.0.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "9283fe3c9a094f9bbddb08e48973da44", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2515c5ff-fa", "ovs_interfaceid": "2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:13:44 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0a:f9:e7,bridge_name='br-int',has_traffic_filtering=True,id=2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7,network=Network(2aae2552-3ef8-41d3-84e6-313da6fc203b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2515c5ff-fa') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:13:44 user nova-compute[71972]: DEBUG os_vif [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:f9:e7,bridge_name='br-int',has_traffic_filtering=True,id=2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7,network=Network(2aae2552-3ef8-41d3-84e6-313da6fc203b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2515c5ff-fa') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:13:44 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:44 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2515c5ff-fa, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:13:44 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:44 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:13:44 user nova-compute[71972]: INFO os_vif [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0a:f9:e7,bridge_name='br-int',has_traffic_filtering=True,id=2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7,network=Network(2aae2552-3ef8-41d3-84e6-313da6fc203b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2515c5ff-fa') Apr 17 22:13:44 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-4e420790-2902-4116-a6f6-b680255b2492 
tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Deleting instance files /opt/stack/data/nova/instances/2b53a15a-2e55-4c9e-976b-addb176545fa_del Apr 17 22:13:44 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Deletion of /opt/stack/data/nova/instances/2b53a15a-2e55-4c9e-976b-addb176545fa_del complete Apr 17 22:13:44 user nova-compute[71972]: INFO nova.compute.manager [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Took 0.85 seconds to destroy the instance on the hypervisor. Apr 17 22:13:44 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:13:44 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:13:44 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:13:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquiring lock "97067629-e099-49fd-bb79-223dd4401405" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "97067629-e099-49fd-bb79-223dd4401405" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquiring lock "97067629-e099-49fd-bb79-223dd4401405-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock 
"97067629-e099-49fd-bb79-223dd4401405-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "97067629-e099-49fd-bb79-223dd4401405-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:44 user nova-compute[71972]: INFO nova.compute.manager [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Terminating instance Apr 17 22:13:44 user nova-compute[71972]: DEBUG nova.compute.manager [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Start destroying the instance on the hypervisor. {{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:13:45 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Took 0.74 seconds to deallocate network for instance. 
Apr 17 22:13:45 user nova-compute[71972]: DEBUG nova.compute.manager [req-c8128841-03c7-49e3-89fa-fab8e724ff81 req-7f452035-fc16-4130-a95f-80cb149bb47f service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Received event network-vif-deleted-2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.249s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:45 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Deleted allocations for instance 2b53a15a-2e55-4c9e-976b-addb176545fa Apr 17 22:13:45 
user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4e420790-2902-4116-a6f6-b680255b2492 tempest-AttachVolumeShelveTestJSON-1051644628 tempest-AttachVolumeShelveTestJSON-1051644628-project-member] Lock "2b53a15a-2e55-4c9e-976b-addb176545fa" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.028s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:45 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 97067629-e099-49fd-bb79-223dd4401405] Instance destroyed successfully. Apr 17 22:13:45 user nova-compute[71972]: DEBUG nova.objects.instance [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lazy-loading 'resources' on Instance uuid 97067629-e099-49fd-bb79-223dd4401405 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:08:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-1698566818',display_name='tempest-ServerRescueNegativeTestJSON-server-1698566818',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-1698566818',id=6,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-17T22:10:55Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='a863c30ce3844f0ba754b048c2833fa3',ramdisk_id='',reservation_id='r-x70r9ud0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-2008986942',owner_user_name='tempest-ServerRescueNegativeTestJSON-2008986942-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:10:56Z,user_data=None,user_id='b3e3003057e7456c933b762412442a3e',uuid=97067629-e099-49fd-bb79-223dd4401405,vcpu_model=,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "63dc9a41-e89e-4673-a658-7acddd88706f", "address": 
"fa:16:3e:3e:d4:f0", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap63dc9a41-e8", "ovs_interfaceid": "63dc9a41-e89e-4673-a658-7acddd88706f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Converting VIF {"id": "63dc9a41-e89e-4673-a658-7acddd88706f", "address": "fa:16:3e:3e:d4:f0", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap63dc9a41-e8", "ovs_interfaceid": "63dc9a41-e89e-4673-a658-7acddd88706f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3e:d4:f0,bridge_name='br-int',has_traffic_filtering=True,id=63dc9a41-e89e-4673-a658-7acddd88706f,network=Network(bd20802f-022c-4a11-8310-49b2375f642e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63dc9a41-e8') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG os_vif [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:d4:f0,bridge_name='br-int',has_traffic_filtering=True,id=63dc9a41-e89e-4673-a658-7acddd88706f,network=Network(bd20802f-022c-4a11-8310-49b2375f642e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63dc9a41-e8') {{(pid=71972) unplug 
/usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63dc9a41-e8, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:45 user nova-compute[71972]: INFO os_vif [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:d4:f0,bridge_name='br-int',has_traffic_filtering=True,id=63dc9a41-e89e-4673-a658-7acddd88706f,network=Network(bd20802f-022c-4a11-8310-49b2375f642e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63dc9a41-e8') Apr 17 22:13:45 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Deleting instance files /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405_del Apr 17 22:13:45 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Deletion of /opt/stack/data/nova/instances/97067629-e099-49fd-bb79-223dd4401405_del complete Apr 17 22:13:45 user nova-compute[71972]: INFO nova.compute.manager [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 97067629-e099-49fd-bb79-223dd4401405] Took 0.68 seconds to destroy the instance on the hypervisor. Apr 17 22:13:45 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: 97067629-e099-49fd-bb79-223dd4401405] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 97067629-e099-49fd-bb79-223dd4401405] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG nova.compute.manager [req-bf510839-e991-4051-ad35-72a32802809f req-0b1c7c29-7e35-47cf-a438-4c540a692d42 service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Received event network-vif-plugged-2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-bf510839-e991-4051-ad35-72a32802809f req-0b1c7c29-7e35-47cf-a438-4c540a692d42 service nova] Acquiring lock "2b53a15a-2e55-4c9e-976b-addb176545fa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-bf510839-e991-4051-ad35-72a32802809f req-0b1c7c29-7e35-47cf-a438-4c540a692d42 service nova] Lock "2b53a15a-2e55-4c9e-976b-addb176545fa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-bf510839-e991-4051-ad35-72a32802809f req-0b1c7c29-7e35-47cf-a438-4c540a692d42 service nova] Lock "2b53a15a-2e55-4c9e-976b-addb176545fa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG nova.compute.manager [req-bf510839-e991-4051-ad35-72a32802809f req-0b1c7c29-7e35-47cf-a438-4c540a692d42 service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] No waiting events found dispatching network-vif-plugged-2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:13:45 user nova-compute[71972]: WARNING nova.compute.manager [req-bf510839-e991-4051-ad35-72a32802809f req-0b1c7c29-7e35-47cf-a438-4c540a692d42 service nova] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Received unexpected event network-vif-plugged-2515c5ff-fad2-4304-a78e-c4d3ab3fdfc7 for instance with vm_state deleted and task_state None. 
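[annotation] The "Deleted allocations for instance ..." INFO entries above are nova.scheduler.client.report asking the placement service to drop that consumer's allocations. A hedged sketch of the equivalent raw placement call (DELETE /allocations/{consumer_uuid}); the endpoint and token below are placeholders for this environment, not values taken from the log:

    # Hedged sketch of the placement call behind
    # "Deleted allocations for instance ..." above.
    import requests

    PLACEMENT_ENDPOINT = 'http://controller/placement'   # assumed endpoint
    SERVICE_TOKEN = 'replace-with-a-valid-service-token'  # assumed token

    def delete_allocations(consumer_uuid):
        resp = requests.delete(
            f'{PLACEMENT_ENDPOINT}/allocations/{consumer_uuid}',
            headers={'X-Auth-Token': SERVICE_TOKEN,
                     'OpenStack-API-Version': 'placement 1.28'})
        # placement answers 204 once the consumer's allocations are gone
        resp.raise_for_status()

    delete_allocations('2b53a15a-2e55-4c9e-976b-addb176545fa')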
Apr 17 22:13:45 user nova-compute[71972]: DEBUG nova.compute.manager [req-bf510839-e991-4051-ad35-72a32802809f req-0b1c7c29-7e35-47cf-a438-4c540a692d42 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] Received event network-vif-unplugged-63dc9a41-e89e-4673-a658-7acddd88706f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-bf510839-e991-4051-ad35-72a32802809f req-0b1c7c29-7e35-47cf-a438-4c540a692d42 service nova] Acquiring lock "97067629-e099-49fd-bb79-223dd4401405-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-bf510839-e991-4051-ad35-72a32802809f req-0b1c7c29-7e35-47cf-a438-4c540a692d42 service nova] Lock "97067629-e099-49fd-bb79-223dd4401405-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-bf510839-e991-4051-ad35-72a32802809f req-0b1c7c29-7e35-47cf-a438-4c540a692d42 service nova] Lock "97067629-e099-49fd-bb79-223dd4401405-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG nova.compute.manager [req-bf510839-e991-4051-ad35-72a32802809f req-0b1c7c29-7e35-47cf-a438-4c540a692d42 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] No waiting events found dispatching network-vif-unplugged-63dc9a41-e89e-4673-a658-7acddd88706f {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG nova.compute.manager [req-bf510839-e991-4051-ad35-72a32802809f req-0b1c7c29-7e35-47cf-a438-4c540a692d42 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] Received event network-vif-unplugged-63dc9a41-e89e-4673-a658-7acddd88706f for instance with task_state deleting. 
{{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG nova.compute.manager [req-bf510839-e991-4051-ad35-72a32802809f req-0b1c7c29-7e35-47cf-a438-4c540a692d42 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] Received event network-vif-plugged-63dc9a41-e89e-4673-a658-7acddd88706f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-bf510839-e991-4051-ad35-72a32802809f req-0b1c7c29-7e35-47cf-a438-4c540a692d42 service nova] Acquiring lock "97067629-e099-49fd-bb79-223dd4401405-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-bf510839-e991-4051-ad35-72a32802809f req-0b1c7c29-7e35-47cf-a438-4c540a692d42 service nova] Lock "97067629-e099-49fd-bb79-223dd4401405-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-bf510839-e991-4051-ad35-72a32802809f req-0b1c7c29-7e35-47cf-a438-4c540a692d42 service nova] Lock "97067629-e099-49fd-bb79-223dd4401405-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:45 user nova-compute[71972]: DEBUG nova.compute.manager [req-bf510839-e991-4051-ad35-72a32802809f req-0b1c7c29-7e35-47cf-a438-4c540a692d42 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] No waiting events found dispatching network-vif-plugged-63dc9a41-e89e-4673-a658-7acddd88706f {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:13:45 user nova-compute[71972]: WARNING nova.compute.manager [req-bf510839-e991-4051-ad35-72a32802809f req-0b1c7c29-7e35-47cf-a438-4c540a692d42 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] Received unexpected event network-vif-plugged-63dc9a41-e89e-4673-a658-7acddd88706f for instance with vm_state rescued and task_state deleting. Apr 17 22:13:46 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 97067629-e099-49fd-bb79-223dd4401405] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:13:46 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 97067629-e099-49fd-bb79-223dd4401405] Took 0.45 seconds to deallocate network for instance. 
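[annotation] The pop_instance_event / "No waiting events found dispatching ..." / "Received unexpected event ... for instance with vm_state deleted" sequence above shows Neutron's external events arriving after the instance has already been torn down, so no waiter is registered for them. A simplified, pure-Python model of that matching (not the actual nova.compute.manager.InstanceEvents implementation), assuming one threading.Event per expected external event:

    # Hedged model of the event dispatch seen above.
    import threading

    class InstanceEvents:
        def __init__(self):
            # (instance_uuid, event_name) -> threading.Event registered by a waiter
            self._events = {}

        def prepare(self, instance_uuid, event_name):
            ev = threading.Event()
            self._events[(instance_uuid, event_name)] = ev
            return ev

        def pop_instance_event(self, instance_uuid, event_name):
            # returns None when nothing is waiting for this event
            return self._events.pop((instance_uuid, event_name), None)

    events = InstanceEvents()
    waiter = events.pop_instance_event(
        'e4e13341-82c2-4b86-8b5a-e12d435513ee',
        'network-vif-plugged-ea831325-55ac-45ca-ab26-4b424e66ca77')
    if waiter is None:
        # mirrors the WARNING above: the instance is already deleted,
        # so the late network-vif-plugged event is logged and dropped
        print('No waiting events found; unexpected event for deleted instance')
    else:
        waiter.set()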
Apr 17 22:13:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:13:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:13:46 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:13:46 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:13:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.212s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:46 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Deleted allocations for instance 97067629-e099-49fd-bb79-223dd4401405 Apr 17 22:13:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-e41ea244-5f47-4ec8-856c-98e4054e4a66 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "97067629-e099-49fd-bb79-223dd4401405" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.512s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:13:47 user nova-compute[71972]: DEBUG nova.compute.manager [req-7fb57d26-b067-4289-bcf5-f3c05e50a5f2 req-556704b4-e1bf-4d5d-820b-27dd7d269ff3 service nova] [instance: 97067629-e099-49fd-bb79-223dd4401405] Received event 
network-vif-deleted-63dc9a41-e89e-4673-a658-7acddd88706f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:13:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:50 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:13:51 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:13:51 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] VM Stopped (Lifecycle Event) Apr 17 22:13:51 user nova-compute[71972]: DEBUG nova.compute.manager [None req-a6390294-ee15-4b71-b20c-7bbc228aea56 None None] [instance: 128ffde0-5149-48d2-a56e-c41418fbc753] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:13:55 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:13:55 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] VM Stopped (Lifecycle Event) Apr 17 22:13:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-26404b3c-1fbf-4b26-abc7-cea755214f28 None None] [instance: e4e13341-82c2-4b86-8b5a-e12d435513ee] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:13:55 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:13:59 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:13:59 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] VM Stopped (Lifecycle Event) Apr 17 22:13:59 user nova-compute[71972]: DEBUG nova.compute.manager [None req-6c4fda09-fe8e-4cdc-9ec7-396e3d5be02c None None] [instance: 2b53a15a-2e55-4c9e-976b-addb176545fa] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:14:00 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:14:00 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 97067629-e099-49fd-bb79-223dd4401405] VM Stopped (Lifecycle Event) Apr 17 22:14:00 user nova-compute[71972]: DEBUG nova.compute.manager [None req-85f9b7c8-3d5b-43f0-9a8b-2505c20e43a1 None None] [instance: 97067629-e099-49fd-bb79-223dd4401405] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:14:00 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:14:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:10 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:14:15 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:20 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:14:25 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:14:25 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Acquiring lock "cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:25 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:25 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Acquiring lock "cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:25 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:25 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:25 user nova-compute[71972]: INFO nova.compute.manager [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Terminating instance Apr 17 22:14:25 user nova-compute[71972]: DEBUG nova.compute.manager [None 
req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Start destroying the instance on the hypervisor. {{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:14:25 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:26 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:26 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:26 user nova-compute[71972]: DEBUG nova.compute.manager [req-62d856bb-c7f9-43e2-a0c1-24b0bfcf0898 req-668ddcc6-5fc8-4cde-9313-3943661c4dd1 service nova] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Received event network-vif-unplugged-bc18e018-6929-44c7-be4a-53fc82ef85e1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:14:26 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-62d856bb-c7f9-43e2-a0c1-24b0bfcf0898 req-668ddcc6-5fc8-4cde-9313-3943661c4dd1 service nova] Acquiring lock "cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:26 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-62d856bb-c7f9-43e2-a0c1-24b0bfcf0898 req-668ddcc6-5fc8-4cde-9313-3943661c4dd1 service nova] Lock "cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:26 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-62d856bb-c7f9-43e2-a0c1-24b0bfcf0898 req-668ddcc6-5fc8-4cde-9313-3943661c4dd1 service nova] Lock "cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:26 user nova-compute[71972]: DEBUG nova.compute.manager [req-62d856bb-c7f9-43e2-a0c1-24b0bfcf0898 req-668ddcc6-5fc8-4cde-9313-3943661c4dd1 service nova] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] No waiting events found dispatching network-vif-unplugged-bc18e018-6929-44c7-be4a-53fc82ef85e1 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:14:26 user nova-compute[71972]: DEBUG nova.compute.manager [req-62d856bb-c7f9-43e2-a0c1-24b0bfcf0898 req-668ddcc6-5fc8-4cde-9313-3943661c4dd1 service nova] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Received event network-vif-unplugged-bc18e018-6929-44c7-be4a-53fc82ef85e1 for instance with task_state deleting. 
{{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:14:26 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:26 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Instance destroyed successfully. Apr 17 22:14:26 user nova-compute[71972]: DEBUG nova.objects.instance [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lazy-loading 'resources' on Instance uuid cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:14:26 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:12:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-1515339523',display_name='tempest-TestMinimumBasicScenario-server-1515339523',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-testminimumbasicscenario-server-1515339523',id=16,image_ref='7caab0a1-de3d-4117-b612-096de189bac9',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCsXE6OrnGHKpZRfRRcidgOPWNw8nV1m4nQYTe67gmTvGnqZQ+yuBJwyAgcKnu1jiPhFMF5q36cDq/h1aU41fT0gQG5M5HcCdSjY9QHTwnJNWH/cKSA+xRQ7GbYrnZAxmQ==',key_name='tempest-TestMinimumBasicScenario-2023545159',keypairs=,launch_index=0,launched_at=2023-04-17T22:12:40Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='a82db257b3494faca3f3759644a51b30',ramdisk_id='',reservation_id='r-xr2x064l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7caab0a1-de3d-4117-b612-096de189bac9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-475067891',owner_user_name='tempest-TestMinimumBasicScenario-475067891-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:12:40Z,user_data=None,user_id='e11872391f1a487a8a8ba5a6d13589f1',uuid=cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bc18e018-6929-44c7-be4a-53fc82ef85e1", "address": "fa:16:3e:6d:de:30", "network": {"id": "a91184a7-8ecf-419a-a191-2853dd054c4b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1266987381-network", "subnets": [{"cidr": 
"10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a82db257b3494faca3f3759644a51b30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc18e018-69", "ovs_interfaceid": "bc18e018-6929-44c7-be4a-53fc82ef85e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:14:26 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Converting VIF {"id": "bc18e018-6929-44c7-be4a-53fc82ef85e1", "address": "fa:16:3e:6d:de:30", "network": {"id": "a91184a7-8ecf-419a-a191-2853dd054c4b", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1266987381-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a82db257b3494faca3f3759644a51b30", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapbc18e018-69", "ovs_interfaceid": "bc18e018-6929-44c7-be4a-53fc82ef85e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:14:26 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:de:30,bridge_name='br-int',has_traffic_filtering=True,id=bc18e018-6929-44c7-be4a-53fc82ef85e1,network=Network(a91184a7-8ecf-419a-a191-2853dd054c4b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc18e018-69') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:14:26 user nova-compute[71972]: DEBUG os_vif [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:de:30,bridge_name='br-int',has_traffic_filtering=True,id=bc18e018-6929-44c7-be4a-53fc82ef85e1,network=Network(a91184a7-8ecf-419a-a191-2853dd054c4b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc18e018-69') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:14:26 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:26 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbc18e018-69, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:14:26 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:26 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:26 user nova-compute[71972]: INFO os_vif [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:de:30,bridge_name='br-int',has_traffic_filtering=True,id=bc18e018-6929-44c7-be4a-53fc82ef85e1,network=Network(a91184a7-8ecf-419a-a191-2853dd054c4b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbc18e018-69') Apr 17 22:14:26 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Deleting instance files /opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b_del Apr 17 22:14:26 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Deletion of /opt/stack/data/nova/instances/cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b_del complete Apr 17 22:14:26 user nova-compute[71972]: INFO nova.compute.manager [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Took 0.86 seconds to destroy the instance on the hypervisor. Apr 17 22:14:26 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:14:26 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:14:26 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:14:27 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:14:27 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Took 0.60 seconds to deallocate network for instance. Apr 17 22:14:27 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:27 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:27 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:14:27 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:14:27 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.197s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:27 user nova-compute[71972]: INFO nova.scheduler.client.report [None 
req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Deleted allocations for instance cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b Apr 17 22:14:27 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f08c0883-6426-4d16-80b1-67b4189ad09f tempest-TestMinimumBasicScenario-475067891 tempest-TestMinimumBasicScenario-475067891-project-member] Lock "cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.837s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:28 user nova-compute[71972]: DEBUG nova.compute.manager [req-5481222e-de1d-4a30-a771-6e64ee20ff5b req-8bd6f276-28f2-416e-a0d4-2ae3e2810a3d service nova] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Received event network-vif-plugged-bc18e018-6929-44c7-be4a-53fc82ef85e1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:14:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-5481222e-de1d-4a30-a771-6e64ee20ff5b req-8bd6f276-28f2-416e-a0d4-2ae3e2810a3d service nova] Acquiring lock "cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-5481222e-de1d-4a30-a771-6e64ee20ff5b req-8bd6f276-28f2-416e-a0d4-2ae3e2810a3d service nova] Lock "cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-5481222e-de1d-4a30-a771-6e64ee20ff5b req-8bd6f276-28f2-416e-a0d4-2ae3e2810a3d service nova] Lock "cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:28 user nova-compute[71972]: DEBUG nova.compute.manager [req-5481222e-de1d-4a30-a771-6e64ee20ff5b req-8bd6f276-28f2-416e-a0d4-2ae3e2810a3d service nova] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] No waiting events found dispatching network-vif-plugged-bc18e018-6929-44c7-be4a-53fc82ef85e1 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:14:28 user nova-compute[71972]: WARNING nova.compute.manager [req-5481222e-de1d-4a30-a771-6e64ee20ff5b req-8bd6f276-28f2-416e-a0d4-2ae3e2810a3d service nova] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Received unexpected event network-vif-plugged-bc18e018-6929-44c7-be4a-53fc82ef85e1 for instance with vm_state deleted and task_state None. 
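The repeated "Inventory has not changed for provider ... based on inventory data: {...}" entries show the resource tracker rebuilding the provider's inventory after each claim or delete and skipping the Placement update when the dict matches what was last reported. A rough sketch of that comparison, using the exact figures logged above (12 VCPU, 16023 MB RAM, 40 GB disk) and a hypothetical push callback in place of the real report client:

    def build_inventory(vcpus, memory_mb, disk_gb):
        return {
            'VCPU':      {'total': vcpus, 'reserved': 0, 'min_unit': 1,
                          'max_unit': vcpus, 'step_size': 1, 'allocation_ratio': 4.0},
            'MEMORY_MB': {'total': memory_mb, 'reserved': 512, 'min_unit': 1,
                          'max_unit': memory_mb, 'step_size': 1, 'allocation_ratio': 1.0},
            'DISK_GB':   {'total': disk_gb, 'reserved': 0, 'min_unit': 1,
                          'max_unit': disk_gb, 'step_size': 1, 'allocation_ratio': 1.0},
        }

    def maybe_update_inventory(cached, new, push):
        """Call push() only when the inventory actually changed."""
        if cached == new:
            # Matches "Inventory has not changed for provider ..." in the log.
            return False
        push(new)
        return True

    # The values reported above: nothing changed, so no PUT to Placement.
    cached = build_inventory(12, 16023, 40)
    assert maybe_update_inventory(cached, build_inventory(12, 16023, 40),
                                  push=lambda inv: None) is False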
Apr 17 22:14:28 user nova-compute[71972]: DEBUG nova.compute.manager [req-5481222e-de1d-4a30-a771-6e64ee20ff5b req-8bd6f276-28f2-416e-a0d4-2ae3e2810a3d service nova] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Received event network-vif-deleted-bc18e018-6929-44c7-be4a-53fc82ef85e1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:14:29 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "559768be-5a58-42c1-bbe8-e87684a0f772" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:29 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "559768be-5a58-42c1-bbe8-e87684a0f772" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:29 user nova-compute[71972]: DEBUG nova.compute.manager [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Starting instance... {{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:14:29 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:29 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:29 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:14:29 user nova-compute[71972]: INFO nova.compute.claims [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Claim successful on node user Apr 17 22:14:29 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:14:29 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:14:29 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.271s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:29 user nova-compute[71972]: DEBUG nova.compute.manager [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:14:29 user nova-compute[71972]: DEBUG nova.compute.manager [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Allocating IP information in the background. 
{{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:14:29 user nova-compute[71972]: DEBUG nova.network.neutron [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:14:29 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Acquiring lock "d9386728-7c3f-42ff-8f1c-51748ccefff3" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:29 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "d9386728-7c3f-42ff-8f1c-51748ccefff3" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:29 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Acquiring lock "d9386728-7c3f-42ff-8f1c-51748ccefff3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:29 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "d9386728-7c3f-42ff-8f1c-51748ccefff3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:29 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "d9386728-7c3f-42ff-8f1c-51748ccefff3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:29 user nova-compute[71972]: INFO nova.compute.manager [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Terminating instance Apr 17 22:14:29 user nova-compute[71972]: DEBUG nova.compute.manager [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Start destroying the instance on the hypervisor. 
{{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:14:29 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 22:14:30 user nova-compute[71972]: DEBUG nova.compute.manager [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Start building block device mappings for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG nova.compute.manager [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Start spawning the instance on the hypervisor. 
{{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:14:30 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Creating image(s) Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "/opt/stack/data/nova/instances/559768be-5a58-42c1-bbe8-e87684a0f772/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "/opt/stack/data/nova/instances/559768be-5a58-42c1-bbe8-e87684a0f772/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "/opt/stack/data/nova/instances/559768be-5a58-42c1-bbe8-e87684a0f772/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG nova.policy [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '52b3e35c03b54ae4b5dabfb1325886a9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e52724ed9bc54905bd5eddd8504e4c77', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': 
None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.146s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG nova.compute.manager [req-ed36de49-06ad-4139-9251-9c3032440542 req-cdd4d72c-f9ae-4721-8d9c-18e74d229d57 service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Received event network-vif-unplugged-54d6ec48-0412-4678-9745-e657a446347d {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-ed36de49-06ad-4139-9251-9c3032440542 req-cdd4d72c-f9ae-4721-8d9c-18e74d229d57 service nova] Acquiring lock "d9386728-7c3f-42ff-8f1c-51748ccefff3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-ed36de49-06ad-4139-9251-9c3032440542 req-cdd4d72c-f9ae-4721-8d9c-18e74d229d57 service nova] Lock "d9386728-7c3f-42ff-8f1c-51748ccefff3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-ed36de49-06ad-4139-9251-9c3032440542 req-cdd4d72c-f9ae-4721-8d9c-18e74d229d57 service nova] Lock 
"d9386728-7c3f-42ff-8f1c-51748ccefff3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG nova.compute.manager [req-ed36de49-06ad-4139-9251-9c3032440542 req-cdd4d72c-f9ae-4721-8d9c-18e74d229d57 service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] No waiting events found dispatching network-vif-unplugged-54d6ec48-0412-4678-9745-e657a446347d {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG nova.compute.manager [req-ed36de49-06ad-4139-9251-9c3032440542 req-cdd4d72c-f9ae-4721-8d9c-18e74d229d57 service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Received event network-vif-unplugged-54d6ec48-0412-4678-9745-e657a446347d for instance with task_state deleting. {{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.137s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/559768be-5a58-42c1-bbe8-e87684a0f772/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/559768be-5a58-42c1-bbe8-e87684a0f772/disk 1073741824" returned: 0 in 0.046s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.188s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Running cmd (subprocess): /usr/bin/python3.10 
-m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.147s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Checking if we can resize image /opt/stack/data/nova/instances/559768be-5a58-42c1-bbe8-e87684a0f772/disk. size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/559768be-5a58-42c1-bbe8-e87684a0f772/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:14:30 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Instance destroyed successfully. 
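The image-build steps for instance 559768be-5a58-42c1-bbe8-e87684a0f772 follow the usual qcow2 backing-file pattern: probe the cached base image with "qemu-img info --force-share --output=json" (Nova wraps this in "python -m oslo_concurrency.prlimit" to cap address space and CPU time, as the CMD lines show), create a qcow2 overlay on top of it with "qemu-img create", then check whether the overlay may be resized to the flavor's root size, refusing to shrink it. A standalone sketch of those steps, with placeholder paths and without the prlimit wrapper:

    import json
    import subprocess

    def qemu_img_info(path):
        # Same probe as the logged CMD, minus the prlimit wrapper.
        out = subprocess.check_output(
            ['qemu-img', 'info', path, '--force-share', '--output=json'])
        return json.loads(out)

    def create_overlay(base, disk, size_bytes):
        # Mirrors: qemu-img create -f qcow2 -o backing_file=...,backing_fmt=raw <disk> <size>
        subprocess.check_call(
            ['qemu-img', 'create', '-f', 'qcow2',
             '-o', f'backing_file={base},backing_fmt=raw',
             disk, str(size_bytes)])

    def can_resize_image(disk, new_size_bytes):
        # Guard behind "Cannot resize image ... to a smaller size." later in the log.
        return qemu_img_info(disk)['virtual-size'] <= new_size_bytes

    if __name__ == '__main__':
        base = '/opt/stack/data/nova/instances/_base/<cached base image>'  # placeholder
        disk = '/opt/stack/data/nova/instances/<instance uuid>/disk'       # placeholder
        create_overlay(base, disk, 1073741824)   # 1 GiB root disk, as logged
        print(can_resize_image(disk, 1073741824))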
Apr 17 22:14:30 user nova-compute[71972]: DEBUG nova.objects.instance [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lazy-loading 'resources' on Instance uuid d9386728-7c3f-42ff-8f1c-51748ccefff3 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:09:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-1397556093',display_name='tempest-VolumesAdminNegativeTest-server-1397556093',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-volumesadminnegativetest-server-1397556093',id=11,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDFeeT2ktb2ezEbDneceaEkdmYL0Q+OMfaTSuKEyaa3rHRe7RFynAalW4DVkVwOj9o4faZsUrmyKgb2c54eejkRheCP0UOpP4kS7Z/K/ZA1t+erJPufbj/hsftZaYU+qdg==',key_name='tempest-keypair-1691187973',keypairs=,launch_index=0,launched_at=2023-04-17T22:10:00Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='416fcd7cd2bc486884f751acab268fd8',ramdisk_id='',reservation_id='r-yqs02rbn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-VolumesAdminNegativeTest-315022411',owner_user_name='tempest-VolumesAdminNegativeTest-315022411-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:10:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='70af0dc4dbf24ae1add76f3c87f8b1b5',uuid=d9386728-7c3f-42ff-8f1c-51748ccefff3,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "54d6ec48-0412-4678-9745-e657a446347d", "address": "fa:16:3e:c6:26:04", "network": {"id": "83a63ee2-d2cb-414d-bab9-c556ee1c2c88", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-625420379-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "416fcd7cd2bc486884f751acab268fd8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d6ec48-04", "ovs_interfaceid": "54d6ec48-0412-4678-9745-e657a446347d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Converting VIF {"id": "54d6ec48-0412-4678-9745-e657a446347d", "address": "fa:16:3e:c6:26:04", "network": {"id": "83a63ee2-d2cb-414d-bab9-c556ee1c2c88", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-625420379-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.222", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "416fcd7cd2bc486884f751acab268fd8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d6ec48-04", "ovs_interfaceid": "54d6ec48-0412-4678-9745-e657a446347d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c6:26:04,bridge_name='br-int',has_traffic_filtering=True,id=54d6ec48-0412-4678-9745-e657a446347d,network=Network(83a63ee2-d2cb-414d-bab9-c556ee1c2c88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d6ec48-04') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG os_vif [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c6:26:04,bridge_name='br-int',has_traffic_filtering=True,id=54d6ec48-0412-4678-9745-e657a446347d,network=Network(83a63ee2-d2cb-414d-bab9-c556ee1c2c88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d6ec48-04') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap54d6ec48-04, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/559768be-5a58-42c1-bbe8-e87684a0f772/disk --force-share --output=json" returned: 0 in 0.141s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Cannot resize image /opt/stack/data/nova/instances/559768be-5a58-42c1-bbe8-e87684a0f772/disk to a smaller size. {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG nova.objects.instance [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lazy-loading 'migration_context' on Instance uuid 559768be-5a58-42c1-bbe8-e87684a0f772 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:14:30 user nova-compute[71972]: INFO os_vif [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c6:26:04,bridge_name='br-int',has_traffic_filtering=True,id=54d6ec48-0412-4678-9745-e657a446347d,network=Network(83a63ee2-d2cb-414d-bab9-c556ee1c2c88),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d6ec48-04') Apr 17 22:14:30 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Deleting instance files /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3_del Apr 17 22:14:30 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Deletion of /opt/stack/data/nova/instances/d9386728-7c3f-42ff-8f1c-51748ccefff3_del complete Apr 17 22:14:30 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 
tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Ensure instance console log exists: /opt/stack/data/nova/instances/559768be-5a58-42c1-bbe8-e87684a0f772/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:30 user nova-compute[71972]: INFO nova.compute.manager [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Took 0.85 seconds to destroy the instance on the hypervisor. Apr 17 22:14:30 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:14:30 user nova-compute[71972]: DEBUG nova.network.neutron [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Successfully created port: 2459db09-685c-4338-848c-fc76939bf0da {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:14:31 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:14:31 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Took 0.86 seconds to deallocate network for instance. Apr 17 22:14:31 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:31 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:31 user nova-compute[71972]: DEBUG nova.compute.manager [req-f14795a6-cb25-4695-82b1-cefc0fab6380 req-cf3ab0c1-1a9e-4cb2-ae94-7f9dc2062ca3 service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Received event network-vif-deleted-54d6ec48-0412-4678-9745-e657a446347d {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:14:31 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:14:31 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 
'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:14:31 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.209s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:31 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Deleted allocations for instance d9386728-7c3f-42ff-8f1c-51748ccefff3 Apr 17 22:14:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-acf357e3-c52c-4d9a-8a06-eb4de8b21211 tempest-VolumesAdminNegativeTest-315022411 tempest-VolumesAdminNegativeTest-315022411-project-member] Lock "d9386728-7c3f-42ff-8f1c-51748ccefff3" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.092s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.network.neutron [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Successfully updated port: 2459db09-685c-4338-848c-fc76939bf0da {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.compute.manager [req-e249b7d9-cb1f-4a9c-9d42-3f3701b44344 req-71e9af6c-8756-4693-ab51-bc5adb5cfad9 service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Received event network-changed-2459db09-685c-4338-848c-fc76939bf0da {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.compute.manager [req-e249b7d9-cb1f-4a9c-9d42-3f3701b44344 req-71e9af6c-8756-4693-ab51-bc5adb5cfad9 service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Refreshing instance network info cache due to event network-changed-2459db09-685c-4338-848c-fc76939bf0da. 
{{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-e249b7d9-cb1f-4a9c-9d42-3f3701b44344 req-71e9af6c-8756-4693-ab51-bc5adb5cfad9 service nova] Acquiring lock "refresh_cache-559768be-5a58-42c1-bbe8-e87684a0f772" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-e249b7d9-cb1f-4a9c-9d42-3f3701b44344 req-71e9af6c-8756-4693-ab51-bc5adb5cfad9 service nova] Acquired lock "refresh_cache-559768be-5a58-42c1-bbe8-e87684a0f772" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.network.neutron [req-e249b7d9-cb1f-4a9c-9d42-3f3701b44344 req-71e9af6c-8756-4693-ab51-bc5adb5cfad9 service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Refreshing network info cache for port 2459db09-685c-4338-848c-fc76939bf0da {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.network.neutron [req-e249b7d9-cb1f-4a9c-9d42-3f3701b44344 req-71e9af6c-8756-4693-ab51-bc5adb5cfad9 service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Instance cache missing network info. {{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "refresh_cache-559768be-5a58-42c1-bbe8-e87684a0f772" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.compute.manager [req-b11dcc32-088e-41a5-a840-99c3b9f229dc req-6150bb80-815c-43c7-a9f2-0f6a8e1588b5 service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Received event network-vif-plugged-54d6ec48-0412-4678-9745-e657a446347d {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-b11dcc32-088e-41a5-a840-99c3b9f229dc req-6150bb80-815c-43c7-a9f2-0f6a8e1588b5 service nova] Acquiring lock "d9386728-7c3f-42ff-8f1c-51748ccefff3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-b11dcc32-088e-41a5-a840-99c3b9f229dc req-6150bb80-815c-43c7-a9f2-0f6a8e1588b5 service nova] Lock "d9386728-7c3f-42ff-8f1c-51748ccefff3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-b11dcc32-088e-41a5-a840-99c3b9f229dc req-6150bb80-815c-43c7-a9f2-0f6a8e1588b5 service nova] Lock "d9386728-7c3f-42ff-8f1c-51748ccefff3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.compute.manager 
[req-b11dcc32-088e-41a5-a840-99c3b9f229dc req-6150bb80-815c-43c7-a9f2-0f6a8e1588b5 service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] No waiting events found dispatching network-vif-plugged-54d6ec48-0412-4678-9745-e657a446347d {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:14:32 user nova-compute[71972]: WARNING nova.compute.manager [req-b11dcc32-088e-41a5-a840-99c3b9f229dc req-6150bb80-815c-43c7-a9f2-0f6a8e1588b5 service nova] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Received unexpected event network-vif-plugged-54d6ec48-0412-4678-9745-e657a446347d for instance with vm_state deleted and task_state None. Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.network.neutron [req-e249b7d9-cb1f-4a9c-9d42-3f3701b44344 req-71e9af6c-8756-4693-ab51-bc5adb5cfad9 service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-e249b7d9-cb1f-4a9c-9d42-3f3701b44344 req-71e9af6c-8756-4693-ab51-bc5adb5cfad9 service nova] Releasing lock "refresh_cache-559768be-5a58-42c1-bbe8-e87684a0f772" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquired lock "refresh_cache-559768be-5a58-42c1-bbe8-e87684a0f772" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.network.neutron [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.network.neutron [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Instance cache missing network info. 
{{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.network.neutron [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Updating instance_info_cache with network_info: [{"id": "2459db09-685c-4338-848c-fc76939bf0da", "address": "fa:16:3e:25:5d:e0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2459db09-68", "ovs_interfaceid": "2459db09-685c-4338-848c-fc76939bf0da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Releasing lock "refresh_cache-559768be-5a58-42c1-bbe8-e87684a0f772" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.compute.manager [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Instance network_info: |[{"id": "2459db09-685c-4338-848c-fc76939bf0da", "address": "fa:16:3e:25:5d:e0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2459db09-68", "ovs_interfaceid": "2459db09-685c-4338-848c-fc76939bf0da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Start 
_get_guest_xml network_info=[{"id": "2459db09-685c-4338-848c-fc76939bf0da", "address": "fa:16:3e:25:5d:e0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2459db09-68", "ovs_interfaceid": "2459db09-685c-4338-848c-fc76939bf0da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:14:32 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:14:32 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71972) _get_desirable_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:14:32 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:14:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-2004521020',display_name='tempest-AttachVolumeNegativeTest-server-2004521020',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-2004521020',id=17,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEsdLKOyShDW6lsdw3PYvB4QT9qwFwLG8+mmqjbOtzoXRKJ/PazrnVxbPepdTwCtT1bx/zxUWI3ltRCf+Nv+ft4L7HR3JkoHrl3xtcd/Er5SEh21KxWlnMx/s+XIYUXdJg==',key_name='tempest-keypair-1141362177',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e52724ed9bc54905bd5eddd8504e4c77',ramdisk_id='',reservation_id='r-vgi6066b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1678965362',owner_user_name='tempest-AttachVolumeNegativeTest-1678965362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:14:30Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='52b3e35c03b54ae4b5dabfb1325886a9',uuid=559768be-5a58-42c1-bbe8-e87684a0f772,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2459db09-685c-4338-848c-fc76939bf0da", "address": "fa:16:3e:25:5d:e0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2459db09-68", "ovs_interfaceid": "2459db09-685c-4338-848c-fc76939bf0da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:14:33 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Converting VIF {"id": "2459db09-685c-4338-848c-fc76939bf0da", "address": "fa:16:3e:25:5d:e0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2459db09-68", "ovs_interfaceid": "2459db09-685c-4338-848c-fc76939bf0da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:14:33 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:5d:e0,bridge_name='br-int',has_traffic_filtering=True,id=2459db09-685c-4338-848c-fc76939bf0da,network=Network(bd9d53b7-0e22-42c4-8b91-49575cadf44f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2459db09-68') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:14:33 user nova-compute[71972]: DEBUG nova.objects.instance [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lazy-loading 'pci_devices' on Instance uuid 559768be-5a58-42c1-bbe8-e87684a0f772 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:14:33 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] End _get_guest_xml xml= Apr 17 22:14:33 user nova-compute[71972]: 559768be-5a58-42c1-bbe8-e87684a0f772 Apr 17 22:14:33 user nova-compute[71972]: instance-00000011 Apr 17 22:14:33 user nova-compute[71972]: 131072 Apr 17 22:14:33 user nova-compute[71972]: 1 Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: tempest-AttachVolumeNegativeTest-server-2004521020 Apr 17 22:14:33 user nova-compute[71972]: 2023-04-17 22:14:32 Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: 128 Apr 17 22:14:33 user nova-compute[71972]: 1 Apr 17 22:14:33 user nova-compute[71972]: 0 Apr 17 22:14:33 user nova-compute[71972]: 0 Apr 17 22:14:33 user nova-compute[71972]: 1 Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: tempest-AttachVolumeNegativeTest-1678965362-project-member Apr 17 22:14:33 user nova-compute[71972]: tempest-AttachVolumeNegativeTest-1678965362 Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: OpenStack Foundation Apr 17 22:14:33 user nova-compute[71972]: OpenStack Nova Apr 17 22:14:33 user nova-compute[71972]: 0.0.0 Apr 17 22:14:33 
user nova-compute[71972]: 559768be-5a58-42c1-bbe8-e87684a0f772 Apr 17 22:14:33 user nova-compute[71972]: 559768be-5a58-42c1-bbe8-e87684a0f772 Apr 17 22:14:33 user nova-compute[71972]: Virtual Machine Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: hvm Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Nehalem Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: /dev/urandom Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: Apr 17 22:14:33 user nova-compute[71972]: {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 22:14:33 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:14:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-2004521020',display_name='tempest-AttachVolumeNegativeTest-server-2004521020',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-2004521020',id=17,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEsdLKOyShDW6lsdw3PYvB4QT9qwFwLG8+mmqjbOtzoXRKJ/PazrnVxbPepdTwCtT1bx/zxUWI3ltRCf+Nv+ft4L7HR3JkoHrl3xtcd/Er5SEh21KxWlnMx/s+XIYUXdJg==',key_name='tempest-keypair-1141362177',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e52724ed9bc54905bd5eddd8504e4c77',ramdisk_id='',reservation_id='r-vgi6066b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1678965362',owner_user_name='tempest-AttachVolumeNegativeTest-1678965362-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:14:30Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='52b3e35c03b54ae4b5dabfb1325886a9',uuid=559768be-5a58-42c1-bbe8-e87684a0f772,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2459db09-685c-4338-848c-fc76939bf0da", "address": "fa:16:3e:25:5d:e0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2459db09-68", "ovs_interfaceid": "2459db09-685c-4338-848c-fc76939bf0da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:14:33 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Converting VIF {"id": "2459db09-685c-4338-848c-fc76939bf0da", "address": "fa:16:3e:25:5d:e0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": 
{"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2459db09-68", "ovs_interfaceid": "2459db09-685c-4338-848c-fc76939bf0da", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:14:33 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:5d:e0,bridge_name='br-int',has_traffic_filtering=True,id=2459db09-685c-4338-848c-fc76939bf0da,network=Network(bd9d53b7-0e22-42c4-8b91-49575cadf44f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2459db09-68') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:14:33 user nova-compute[71972]: DEBUG os_vif [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:5d:e0,bridge_name='br-int',has_traffic_filtering=True,id=2459db09-685c-4338-848c-fc76939bf0da,network=Network(bd9d53b7-0e22-42c4-8b91-49575cadf44f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2459db09-68') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:14:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:14:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:14:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2459db09-68, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:14:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2459db09-68, col_values=(('external_ids', {'iface-id': '2459db09-685c-4338-848c-fc76939bf0da', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:25:5d:e0', 'vm-uuid': '559768be-5a58-42c1-bbe8-e87684a0f772'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:14:33 user nova-compute[71972]: 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:14:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:33 user nova-compute[71972]: INFO os_vif [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:5d:e0,bridge_name='br-int',has_traffic_filtering=True,id=2459db09-685c-4338-848c-fc76939bf0da,network=Network(bd9d53b7-0e22-42c4-8b91-49575cadf44f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2459db09-68') Apr 17 22:14:33 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] No BDM found with device name vda, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:14:33 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] No VIF found with MAC fa:16:3e:25:5d:e0, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:14:34 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:34 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:34 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:34 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:14:34 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Cleaning up deleted instances with incomplete migration {{(pid=71972) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 17 22:14:34 user nova-compute[71972]: DEBUG nova.compute.manager [req-12e67bbc-4097-421e-afb4-4408851463a7 req-bb83c088-7dcb-448f-9586-4dd5601dd284 service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Received event network-vif-plugged-2459db09-685c-4338-848c-fc76939bf0da {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:14:34 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-12e67bbc-4097-421e-afb4-4408851463a7 req-bb83c088-7dcb-448f-9586-4dd5601dd284 service nova] Acquiring lock "559768be-5a58-42c1-bbe8-e87684a0f772-events" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:34 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-12e67bbc-4097-421e-afb4-4408851463a7 req-bb83c088-7dcb-448f-9586-4dd5601dd284 service nova] Lock "559768be-5a58-42c1-bbe8-e87684a0f772-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:34 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-12e67bbc-4097-421e-afb4-4408851463a7 req-bb83c088-7dcb-448f-9586-4dd5601dd284 service nova] Lock "559768be-5a58-42c1-bbe8-e87684a0f772-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:34 user nova-compute[71972]: DEBUG nova.compute.manager [req-12e67bbc-4097-421e-afb4-4408851463a7 req-bb83c088-7dcb-448f-9586-4dd5601dd284 service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] No waiting events found dispatching network-vif-plugged-2459db09-685c-4338-848c-fc76939bf0da {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:14:34 user nova-compute[71972]: WARNING nova.compute.manager [req-12e67bbc-4097-421e-afb4-4408851463a7 req-bb83c088-7dcb-448f-9586-4dd5601dd284 service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Received unexpected event network-vif-plugged-2459db09-685c-4338-848c-fc76939bf0da for instance with vm_state building and task_state spawning. Apr 17 22:14:34 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:34 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:34 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquiring lock "9634492c-168d-4b49-941a-b89703571b73" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "9634492c-168d-4b49-941a-b89703571b73" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquiring lock "9634492c-168d-4b49-941a-b89703571b73-events" 
by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "9634492c-168d-4b49-941a-b89703571b73-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "9634492c-168d-4b49-941a-b89703571b73-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:35 user nova-compute[71972]: INFO nova.compute.manager [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Terminating instance Apr 17 22:14:35 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Start destroying the instance on the hypervisor. 
{{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:14:35 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:35 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:35 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:35 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:35 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:35 user nova-compute[71972]: DEBUG nova.compute.manager [req-d67774f6-17d6-4782-9b9b-e507aaee5520 req-f3f0125a-a666-488a-a6bc-295e53efef72 service nova] [instance: 9634492c-168d-4b49-941a-b89703571b73] Received event network-vif-unplugged-d22f6b6c-44f0-472e-b05f-192e12d56f32 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:14:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-d67774f6-17d6-4782-9b9b-e507aaee5520 req-f3f0125a-a666-488a-a6bc-295e53efef72 service nova] Acquiring lock "9634492c-168d-4b49-941a-b89703571b73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-d67774f6-17d6-4782-9b9b-e507aaee5520 req-f3f0125a-a666-488a-a6bc-295e53efef72 service nova] Lock "9634492c-168d-4b49-941a-b89703571b73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-d67774f6-17d6-4782-9b9b-e507aaee5520 req-f3f0125a-a666-488a-a6bc-295e53efef72 service nova] Lock "9634492c-168d-4b49-941a-b89703571b73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:35 user nova-compute[71972]: DEBUG nova.compute.manager [req-d67774f6-17d6-4782-9b9b-e507aaee5520 req-f3f0125a-a666-488a-a6bc-295e53efef72 service nova] [instance: 9634492c-168d-4b49-941a-b89703571b73] No waiting events found dispatching network-vif-unplugged-d22f6b6c-44f0-472e-b05f-192e12d56f32 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:14:35 user nova-compute[71972]: DEBUG nova.compute.manager [req-d67774f6-17d6-4782-9b9b-e507aaee5520 req-f3f0125a-a666-488a-a6bc-295e53efef72 service nova] [instance: 9634492c-168d-4b49-941a-b89703571b73] Received event network-vif-unplugged-d22f6b6c-44f0-472e-b05f-192e12d56f32 for instance with task_state deleting. 
{{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:14:36 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 9634492c-168d-4b49-941a-b89703571b73] Instance destroyed successfully. Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.objects.instance [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lazy-loading 'resources' on Instance uuid 9634492c-168d-4b49-941a-b89703571b73 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:08:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-364503782',display_name='tempest-ServerRescueNegativeTestJSON-server-364503782',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverrescuenegativetestjson-server-364503782',id=5,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-17T22:09:21Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='a863c30ce3844f0ba754b048c2833fa3',ramdisk_id='',reservation_id='r-x1sgqnl0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerRescueNegativeTestJSON-2008986942',owner_user_name='tempest-ServerRescueNegativeTestJSON-2008986942-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:09:22Z,user_data=None,user_id='b3e3003057e7456c933b762412442a3e',uuid=9634492c-168d-4b49-941a-b89703571b73,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "address": "fa:16:3e:ab:f9:49", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd22f6b6c-44", "ovs_interfaceid": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Converting VIF {"id": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "address": "fa:16:3e:ab:f9:49", "network": {"id": "bd20802f-022c-4a11-8310-49b2375f642e", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-172869997-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "a863c30ce3844f0ba754b048c2833fa3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapd22f6b6c-44", "ovs_interfaceid": "d22f6b6c-44f0-472e-b05f-192e12d56f32", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ab:f9:49,bridge_name='br-int',has_traffic_filtering=True,id=d22f6b6c-44f0-472e-b05f-192e12d56f32,network=Network(bd20802f-022c-4a11-8310-49b2375f642e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd22f6b6c-44') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG os_vif [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:f9:49,bridge_name='br-int',has_traffic_filtering=True,id=d22f6b6c-44f0-472e-b05f-192e12d56f32,network=Network(bd20802f-022c-4a11-8310-49b2375f642e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd22f6b6c-44') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd22f6b6c-44, 
bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:14:36 user nova-compute[71972]: INFO os_vif [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ab:f9:49,bridge_name='br-int',has_traffic_filtering=True,id=d22f6b6c-44f0-472e-b05f-192e12d56f32,network=Network(bd20802f-022c-4a11-8310-49b2375f642e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd22f6b6c-44') Apr 17 22:14:36 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Deleting instance files /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73_del Apr 17 22:14:36 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Deletion of /opt/stack/data/nova/instances/9634492c-168d-4b49-941a-b89703571b73_del complete Apr 17 22:14:36 user nova-compute[71972]: INFO nova.compute.manager [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] [instance: 9634492c-168d-4b49-941a-b89703571b73] Took 0.83 seconds to destroy the instance on the hypervisor. Apr 17 22:14:36 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: 9634492c-168d-4b49-941a-b89703571b73] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 9634492c-168d-4b49-941a-b89703571b73] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:14:36 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] VM Resumed (Lifecycle Event) Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.compute.manager [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Instance event wait completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:14:36 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Instance spawned successfully. 
Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 
559768be-5a58-42c1-bbe8-e87684a0f772] Found default for hw_vif_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:14:36 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:14:36 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] VM Started (Lifecycle Event) Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:14:36 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:14:36 user nova-compute[71972]: INFO nova.compute.manager [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Took 6.46 seconds to spawn the instance on the hypervisor. Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.compute.manager [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:14:36 user nova-compute[71972]: INFO nova.compute.manager [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Took 7.04 seconds to build instance. 
Apr 17 22:14:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-c6dc25db-f2e1-4e92-a776-1839e510a678 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "559768be-5a58-42c1-bbe8-e87684a0f772" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.136s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.compute.manager [req-f4484c73-2faa-49b1-b59c-265e3af34d03 req-b48e4a80-522a-46d7-b483-1d960df06988 service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Received event network-vif-plugged-2459db09-685c-4338-848c-fc76939bf0da {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-f4484c73-2faa-49b1-b59c-265e3af34d03 req-b48e4a80-522a-46d7-b483-1d960df06988 service nova] Acquiring lock "559768be-5a58-42c1-bbe8-e87684a0f772-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-f4484c73-2faa-49b1-b59c-265e3af34d03 req-b48e4a80-522a-46d7-b483-1d960df06988 service nova] Lock "559768be-5a58-42c1-bbe8-e87684a0f772-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-f4484c73-2faa-49b1-b59c-265e3af34d03 req-b48e4a80-522a-46d7-b483-1d960df06988 service nova] Lock "559768be-5a58-42c1-bbe8-e87684a0f772-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG nova.compute.manager 
[req-f4484c73-2faa-49b1-b59c-265e3af34d03 req-b48e4a80-522a-46d7-b483-1d960df06988 service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] No waiting events found dispatching network-vif-plugged-2459db09-685c-4338-848c-fc76939bf0da {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:14:36 user nova-compute[71972]: WARNING nova.compute.manager [req-f4484c73-2faa-49b1-b59c-265e3af34d03 req-b48e4a80-522a-46d7-b483-1d960df06988 service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Received unexpected event network-vif-plugged-2459db09-685c-4338-848c-fc76939bf0da for instance with vm_state active and task_state None. Apr 17 22:14:36 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:14:36 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:14:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:14:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:14:37 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 9634492c-168d-4b49-941a-b89703571b73] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:14:37 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 9634492c-168d-4b49-941a-b89703571b73] Took 0.79 seconds to deallocate network for instance. 
Apr 17 22:14:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json" returned: 0 in 0.161s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:14:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:14:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:14:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/559768be-5a58-42c1-bbe8-e87684a0f772/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:14:37 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:14:37 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 
'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:14:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.240s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/559768be-5a58-42c1-bbe8-e87684a0f772/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:14:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/559768be-5a58-42c1-bbe8-e87684a0f772/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:14:37 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Deleted allocations for instance 9634492c-168d-4b49-941a-b89703571b73 Apr 17 22:14:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b4bab377-9c1c-45f1-9653-d0a2c79cf905 tempest-ServerRescueNegativeTestJSON-2008986942 tempest-ServerRescueNegativeTestJSON-2008986942-project-member] Lock "9634492c-168d-4b49-941a-b89703571b73" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.111s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/559768be-5a58-42c1-bbe8-e87684a0f772/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:14:38 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:38 user nova-compute[71972]: DEBUG nova.compute.manager [req-7369e214-984f-4b7e-aeb7-fff0bbb33d99 req-14ae359d-13ee-458b-b204-309ae76a2cb7 service nova] [instance: 9634492c-168d-4b49-941a-b89703571b73] Received event network-vif-plugged-d22f6b6c-44f0-472e-b05f-192e12d56f32 
{{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:14:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-7369e214-984f-4b7e-aeb7-fff0bbb33d99 req-14ae359d-13ee-458b-b204-309ae76a2cb7 service nova] Acquiring lock "9634492c-168d-4b49-941a-b89703571b73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-7369e214-984f-4b7e-aeb7-fff0bbb33d99 req-14ae359d-13ee-458b-b204-309ae76a2cb7 service nova] Lock "9634492c-168d-4b49-941a-b89703571b73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-7369e214-984f-4b7e-aeb7-fff0bbb33d99 req-14ae359d-13ee-458b-b204-309ae76a2cb7 service nova] Lock "9634492c-168d-4b49-941a-b89703571b73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:38 user nova-compute[71972]: DEBUG nova.compute.manager [req-7369e214-984f-4b7e-aeb7-fff0bbb33d99 req-14ae359d-13ee-458b-b204-309ae76a2cb7 service nova] [instance: 9634492c-168d-4b49-941a-b89703571b73] No waiting events found dispatching network-vif-plugged-d22f6b6c-44f0-472e-b05f-192e12d56f32 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:14:38 user nova-compute[71972]: WARNING nova.compute.manager [req-7369e214-984f-4b7e-aeb7-fff0bbb33d99 req-14ae359d-13ee-458b-b204-309ae76a2cb7 service nova] [instance: 9634492c-168d-4b49-941a-b89703571b73] Received unexpected event network-vif-plugged-d22f6b6c-44f0-472e-b05f-192e12d56f32 for instance with vm_state deleted and task_state None. Apr 17 22:14:38 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:14:38 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 22:14:38 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=8692MB free_disk=26.620624542236328GB free_vcpus=9 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:14:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:38 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 7bb0580b-727f-4168-9d56-56dcb4fa404e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:14:38 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 99cb7131-abb8-41d6-bddd-c3bc943b7678 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:14:38 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 559768be-5a58-42c1-bbe8-e87684a0f772 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:14:38 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 3 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:14:38 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=896MB phys_disk=40GB used_disk=3GB total_vcpus=12 used_vcpus=3 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:14:38 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:14:38 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:14:38 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:14:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.258s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:38 user nova-compute[71972]: DEBUG nova.compute.manager [req-47a72043-30fc-4086-94be-c6a1b0cedadd req-7e4ed5a9-f670-4259-9438-9d995a80ed77 service nova] [instance: 9634492c-168d-4b49-941a-b89703571b73] Received event network-vif-deleted-d22f6b6c-44f0-472e-b05f-192e12d56f32 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:14:39 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:14:39 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:14:40 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances 
{{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:14:40 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:14:40 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:14:40 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:14:41 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:41 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:14:41 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:14:41 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] VM Stopped (Lifecycle Event) Apr 17 22:14:41 user nova-compute[71972]: DEBUG nova.compute.manager [None req-dc299b08-a865-47b3-ac42-4f0e26744d61 None None] [instance: cd3d0c4d-cfa1-4f82-b0d9-c0263f3c929b] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:14:42 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:14:42 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:14:42 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Didn't find any instances for network info cache update. 
{{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 17 22:14:43 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:43 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:14:44 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:14:44 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:14:44 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Cleaning up deleted instances {{(pid=71972) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 17 22:14:44 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] There are 0 instances to clean {{(pid=71972) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 17 22:14:45 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:14:45 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:14:45 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] VM Stopped (Lifecycle Event) Apr 17 22:14:45 user nova-compute[71972]: DEBUG nova.compute.manager [None req-ac9e22ca-18bb-4e19-acf0-c6a4cba38da5 None None] [instance: d9386728-7c3f-42ff-8f1c-51748ccefff3] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:14:45 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Acquiring lock "ae185464-abd9-412d-bde1-d667c074abf8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71972) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "ae185464-abd9-412d-bde1-d667c074abf8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:48 user nova-compute[71972]: DEBUG nova.compute.manager [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Starting instance... {{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:14:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:48 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:14:48 user nova-compute[71972]: INFO nova.compute.claims [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Claim successful on node user Apr 17 22:14:48 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Refreshing inventories for resource provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 17 22:14:48 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Updating ProviderTree inventory for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 17 22:14:48 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Updating inventory in ProviderTree for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 17 22:14:48 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Refreshing aggregate associations for resource provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e, aggregates: None {{(pid=71972) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 17 22:14:48 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Refreshing trait associations for resource provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e, traits: 
HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_RESCUE_BFV,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NODE,HW_CPU_X86_SSE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_VOLUME_MULTI_ATTACH {{(pid=71972) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 17 22:14:48 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:14:48 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:14:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.374s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:48 user nova-compute[71972]: DEBUG nova.compute.manager [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG nova.compute.manager [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Allocating IP information in the background. 
{{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG nova.network.neutron [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:14:49 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 22:14:49 user nova-compute[71972]: DEBUG nova.compute.manager [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Start building block device mappings for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG nova.policy [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '30c12a4244db438ea682e545c378abe1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '19f2a3034fe9458682e1242c91e2ce45', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG nova.compute.manager [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Start spawning the instance on the hypervisor. 
{{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:14:49 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Creating image(s) Apr 17 22:14:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Acquiring lock "/opt/stack/data/nova/instances/ae185464-abd9-412d-bde1-d667c074abf8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "/opt/stack/data/nova/instances/ae185464-abd9-412d-bde1-d667c074abf8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "/opt/stack/data/nova/instances/ae185464-abd9-412d-bde1-d667c074abf8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.137s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 
tempest-AttachVolumeTestJSON-4448958-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.133s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/ae185464-abd9-412d-bde1-d667c074abf8/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/ae185464-abd9-412d-bde1-d667c074abf8/disk 1073741824" returned: 0 in 0.045s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.184s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG 
oslo_concurrency.processutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.133s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Checking if we can resize image /opt/stack/data/nova/instances/ae185464-abd9-412d-bde1-d667c074abf8/disk. size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ae185464-abd9-412d-bde1-d667c074abf8/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ae185464-abd9-412d-bde1-d667c074abf8/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Cannot resize image /opt/stack/data/nova/instances/ae185464-abd9-412d-bde1-d667c074abf8/disk to a smaller size. 
{{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG nova.objects.instance [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lazy-loading 'migration_context' on Instance uuid ae185464-abd9-412d-bde1-d667c074abf8 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Ensure instance console log exists: /opt/stack/data/nova/instances/ae185464-abd9-412d-bde1-d667c074abf8/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:49 user nova-compute[71972]: DEBUG nova.network.neutron [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Successfully created port: 663fa6c4-7d6d-4083-8ceb-7d08ec368373 {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.network.neutron [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Successfully updated port: 663fa6c4-7d6d-4083-8ceb-7d08ec368373 {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Acquiring lock 
"refresh_cache-ae185464-abd9-412d-bde1-d667c074abf8" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Acquired lock "refresh_cache-ae185464-abd9-412d-bde1-d667c074abf8" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.network.neutron [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.compute.manager [req-290b33de-2a81-40f7-a7f1-ec0b33066fa9 req-4e112374-a87d-4bde-8354-fffa01af9cb0 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Received event network-changed-663fa6c4-7d6d-4083-8ceb-7d08ec368373 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.compute.manager [req-290b33de-2a81-40f7-a7f1-ec0b33066fa9 req-4e112374-a87d-4bde-8354-fffa01af9cb0 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Refreshing instance network info cache due to event network-changed-663fa6c4-7d6d-4083-8ceb-7d08ec368373. {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-290b33de-2a81-40f7-a7f1-ec0b33066fa9 req-4e112374-a87d-4bde-8354-fffa01af9cb0 service nova] Acquiring lock "refresh_cache-ae185464-abd9-412d-bde1-d667c074abf8" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.network.neutron [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Instance cache missing network info. 
{{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.network.neutron [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Updating instance_info_cache with network_info: [{"id": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "address": "fa:16:3e:e9:cb:55", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap663fa6c4-7d", "ovs_interfaceid": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Releasing lock "refresh_cache-ae185464-abd9-412d-bde1-d667c074abf8" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.compute.manager [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Instance network_info: |[{"id": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "address": "fa:16:3e:e9:cb:55", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap663fa6c4-7d", "ovs_interfaceid": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-290b33de-2a81-40f7-a7f1-ec0b33066fa9 req-4e112374-a87d-4bde-8354-fffa01af9cb0 service nova] Acquired lock "refresh_cache-ae185464-abd9-412d-bde1-d667c074abf8" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 
17 22:14:50 user nova-compute[71972]: DEBUG nova.network.neutron [req-290b33de-2a81-40f7-a7f1-ec0b33066fa9 req-4e112374-a87d-4bde-8354-fffa01af9cb0 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Refreshing network info cache for port 663fa6c4-7d6d-4083-8ceb-7d08ec368373 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Start _get_guest_xml network_info=[{"id": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "address": "fa:16:3e:e9:cb:55", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap663fa6c4-7d", "ovs_interfaceid": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:14:50 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:14:50 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.virt.hardware 
[None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:14:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1425692942',display_name='tempest-AttachVolumeTestJSON-server-1425692942',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1425692942',id=18,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEigsT6gLV0q/K0pHew8brR9QnMOnMtp0npXVhkRkqKTM+AOv4+ez0mMtBSy/IoES+GPDYkutnXc9/wVhggYNrjJV46CJK3m/8xwidBZV1wpxHBatJMpbDsmiChUV8RfCg==',key_name='tempest-keypair-192135788',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='19f2a3034fe9458682e1242c91e2ce45',ramdisk_id='',reservation_id='r-e4k2g86x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-4448958',owner_user_name='tempest-AttachVolumeTestJSON-4448958-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:14:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='30c12a4244db438ea682e545c378abe1',uuid=ae185464-abd9-412d-bde1-d667c074abf8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "address": "fa:16:3e:e9:cb:55", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap663fa6c4-7d", "ovs_interfaceid": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Converting VIF {"id": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "address": "fa:16:3e:e9:cb:55", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap663fa6c4-7d", "ovs_interfaceid": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:cb:55,bridge_name='br-int',has_traffic_filtering=True,id=663fa6c4-7d6d-4083-8ceb-7d08ec368373,network=Network(8ffc4041-9c9c-4b0f-9342-dcd76540515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap663fa6c4-7d') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.objects.instance [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lazy-loading 'pci_devices' on Instance uuid ae185464-abd9-412d-bde1-d667c074abf8 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:14:50 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] End _get_guest_xml xml= Apr 17 22:14:50 user nova-compute[71972]: ae185464-abd9-412d-bde1-d667c074abf8 Apr 17 22:14:50 user nova-compute[71972]: instance-00000012 Apr 17 22:14:50 user nova-compute[71972]: 131072 Apr 17 22:14:50 user nova-compute[71972]: 1 Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: tempest-AttachVolumeTestJSON-server-1425692942 Apr 17 22:14:50 user nova-compute[71972]: 2023-04-17 22:14:50 Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: 128 Apr 17 22:14:50 user nova-compute[71972]: 1 Apr 17 22:14:50 user nova-compute[71972]: 0 Apr 17 22:14:50 user nova-compute[71972]: 0 Apr 17 22:14:50 user nova-compute[71972]: 1 Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: tempest-AttachVolumeTestJSON-4448958-project-member Apr 17 22:14:50 user nova-compute[71972]: tempest-AttachVolumeTestJSON-4448958 Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: OpenStack Foundation Apr 17 22:14:50 user nova-compute[71972]: OpenStack Nova Apr 17 22:14:50 user nova-compute[71972]: 0.0.0 Apr 17 22:14:50 user nova-compute[71972]: ae185464-abd9-412d-bde1-d667c074abf8 Apr 17 22:14:50 user 
nova-compute[71972]: ae185464-abd9-412d-bde1-d667c074abf8 Apr 17 22:14:50 user nova-compute[71972]: Virtual Machine Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: hvm Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Nehalem Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: /dev/urandom Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: Apr 17 22:14:50 user nova-compute[71972]: {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 22:14:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:14:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1425692942',display_name='tempest-AttachVolumeTestJSON-server-1425692942',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1425692942',id=18,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEigsT6gLV0q/K0pHew8brR9QnMOnMtp0npXVhkRkqKTM+AOv4+ez0mMtBSy/IoES+GPDYkutnXc9/wVhggYNrjJV46CJK3m/8xwidBZV1wpxHBatJMpbDsmiChUV8RfCg==',key_name='tempest-keypair-192135788',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='19f2a3034fe9458682e1242c91e2ce45',ramdisk_id='',reservation_id='r-e4k2g86x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-4448958',owner_user_name='tempest-AttachVolumeTestJSON-4448958-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:14:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='30c12a4244db438ea682e545c378abe1',uuid=ae185464-abd9-412d-bde1-d667c074abf8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "address": "fa:16:3e:e9:cb:55", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap663fa6c4-7d", "ovs_interfaceid": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:14:51 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Converting VIF {"id": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "address": "fa:16:3e:e9:cb:55", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap663fa6c4-7d", "ovs_interfaceid": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:14:51 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e9:cb:55,bridge_name='br-int',has_traffic_filtering=True,id=663fa6c4-7d6d-4083-8ceb-7d08ec368373,network=Network(8ffc4041-9c9c-4b0f-9342-dcd76540515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap663fa6c4-7d') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:14:51 user nova-compute[71972]: DEBUG os_vif [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:cb:55,bridge_name='br-int',has_traffic_filtering=True,id=663fa6c4-7d6d-4083-8ceb-7d08ec368373,network=Network(8ffc4041-9c9c-4b0f-9342-dcd76540515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap663fa6c4-7d') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:14:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:14:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:14:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap663fa6c4-7d, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:14:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap663fa6c4-7d, col_values=(('external_ids', {'iface-id': '663fa6c4-7d6d-4083-8ceb-7d08ec368373', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e9:cb:55', 'vm-uuid': 'ae185464-abd9-412d-bde1-d667c074abf8'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:14:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 
{{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:14:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:51 user nova-compute[71972]: INFO os_vif [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e9:cb:55,bridge_name='br-int',has_traffic_filtering=True,id=663fa6c4-7d6d-4083-8ceb-7d08ec368373,network=Network(8ffc4041-9c9c-4b0f-9342-dcd76540515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap663fa6c4-7d') Apr 17 22:14:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] No BDM found with device name vda, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:14:51 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] No VIF found with MAC fa:16:3e:e9:cb:55, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:14:51 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:14:51 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 9634492c-168d-4b49-941a-b89703571b73] VM Stopped (Lifecycle Event) Apr 17 22:14:51 user nova-compute[71972]: DEBUG nova.compute.manager [None req-a0d88e81-3770-46d0-b141-04fd8fe9e26c None None] [instance: 9634492c-168d-4b49-941a-b89703571b73] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:14:51 user nova-compute[71972]: DEBUG nova.network.neutron [req-290b33de-2a81-40f7-a7f1-ec0b33066fa9 req-4e112374-a87d-4bde-8354-fffa01af9cb0 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Updated VIF entry in instance network info cache for port 663fa6c4-7d6d-4083-8ceb-7d08ec368373. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:14:51 user nova-compute[71972]: DEBUG nova.network.neutron [req-290b33de-2a81-40f7-a7f1-ec0b33066fa9 req-4e112374-a87d-4bde-8354-fffa01af9cb0 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Updating instance_info_cache with network_info: [{"id": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "address": "fa:16:3e:e9:cb:55", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap663fa6c4-7d", "ovs_interfaceid": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:14:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-290b33de-2a81-40f7-a7f1-ec0b33066fa9 req-4e112374-a87d-4bde-8354-fffa01af9cb0 service nova] Releasing lock "refresh_cache-ae185464-abd9-412d-bde1-d667c074abf8" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:14:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:52 user nova-compute[71972]: DEBUG nova.compute.manager [req-5551b9a2-b529-4e32-8e2e-8a0bb539a036 req-2c59607c-ace1-44a4-909b-71ecbd052180 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Received event network-vif-plugged-663fa6c4-7d6d-4083-8ceb-7d08ec368373 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:14:52 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-5551b9a2-b529-4e32-8e2e-8a0bb539a036 req-2c59607c-ace1-44a4-909b-71ecbd052180 service nova] Acquiring lock "ae185464-abd9-412d-bde1-d667c074abf8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:52 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-5551b9a2-b529-4e32-8e2e-8a0bb539a036 req-2c59607c-ace1-44a4-909b-71ecbd052180 service nova] Lock "ae185464-abd9-412d-bde1-d667c074abf8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:52 user 
nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-5551b9a2-b529-4e32-8e2e-8a0bb539a036 req-2c59607c-ace1-44a4-909b-71ecbd052180 service nova] Lock "ae185464-abd9-412d-bde1-d667c074abf8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:52 user nova-compute[71972]: DEBUG nova.compute.manager [req-5551b9a2-b529-4e32-8e2e-8a0bb539a036 req-2c59607c-ace1-44a4-909b-71ecbd052180 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] No waiting events found dispatching network-vif-plugged-663fa6c4-7d6d-4083-8ceb-7d08ec368373 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:14:52 user nova-compute[71972]: WARNING nova.compute.manager [req-5551b9a2-b529-4e32-8e2e-8a0bb539a036 req-2c59607c-ace1-44a4-909b-71ecbd052180 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Received unexpected event network-vif-plugged-663fa6c4-7d6d-4083-8ceb-7d08ec368373 for instance with vm_state building and task_state spawning. Apr 17 22:14:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:54 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:14:54 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: ae185464-abd9-412d-bde1-d667c074abf8] VM Resumed (Lifecycle Event) Apr 17 22:14:54 user nova-compute[71972]: DEBUG nova.compute.manager [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Instance event wait completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:14:54 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:14:54 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Instance spawned successfully. 
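Editor's note: the plug sequence earlier in this run — AddBridgeCommand, AddPortCommand and the DbSetCommand that writes iface-id, iface-status, attached-mac and vm-uuid into the Interface's external_ids — is os_vif driving ovsdbapp directly over the OVSDB socket. As a rough illustration only (os_vif uses the ovsdbapp Python API, not the CLI), the same transaction expressed with ovs-vsctl, using the values from the log, would look like this sketch:

    # Illustrative only: the OVSDB transaction from the log re-expressed with
    # ovs-vsctl. Values are copied from the log; os_vif itself talks to OVSDB
    # through ovsdbapp rather than shelling out.
    import subprocess

    BRIDGE = "br-int"
    PORT = "tap663fa6c4-7d"
    IFACE_ID = "663fa6c4-7d6d-4083-8ceb-7d08ec368373"
    MAC = "fa:16:3e:e9:cb:55"
    VM_UUID = "ae185464-abd9-412d-bde1-d667c074abf8"

    def run(*cmd):
        # check=True so a failed OVSDB operation raises instead of passing silently
        subprocess.run(cmd, check=True)

    # AddBridgeCommand(name=br-int, may_exist=True, datapath_type=system)
    run("ovs-vsctl", "--may-exist", "add-br", BRIDGE,
        "--", "set", "Bridge", BRIDGE, "datapath_type=system")

    # AddPortCommand(bridge=br-int, port=tap663fa6c4-7d, may_exist=True) plus the
    # DbSetCommand that records the Neutron binding details in external_ids
    run("ovs-vsctl", "--may-exist", "add-port", BRIDGE, PORT,
        "--", "set", "Interface", PORT,
        f"external_ids:iface-id={IFACE_ID}",
        "external_ids:iface-status=active",
        f"external_ids:attached-mac={MAC}",
        f"external_ids:vm-uuid={VM_UUID}")
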
Apr 17 22:14:54 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:14:54 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:14:54 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:14:54 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:14:54 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:14:54 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:14:54 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:14:54 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:14:54 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Found default for hw_vif_model of virtio {{(pid=71972) 
_register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:14:54 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: ae185464-abd9-412d-bde1-d667c074abf8] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:14:54 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:14:54 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: ae185464-abd9-412d-bde1-d667c074abf8] VM Started (Lifecycle Event) Apr 17 22:14:54 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:14:54 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:14:54 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: ae185464-abd9-412d-bde1-d667c074abf8] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:14:54 user nova-compute[71972]: INFO nova.compute.manager [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Took 5.34 seconds to spawn the instance on the hypervisor. 
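Editor's note: the two "Synchronizing instance power state" entries above compare the power state stored in the Nova database (0, nothing recorded yet) with what libvirt reports (1, running), and both syncs are skipped because the instance still has task_state spawning. A minimal sketch of that decision, using only the two numeric codes that appear in this log and the skip-on-pending-task rule stated in the messages (purely illustrative, not Nova's implementation):

    # Sketch of the check described in the log: compare the DB power state with
    # the hypervisor's, and skip the sync while a task is still in flight.
    # Only the two codes that appear in this log are mapped here.
    POWER_STATES = {0: "NOSTATE", 1: "RUNNING"}

    def sync_power_state(db_state, vm_state, task_state):
        if task_state is not None:
            # "During sync_power_state the instance has a pending task (spawning). Skip."
            return "skip: pending task {}".format(task_state)
        if db_state != vm_state:
            return "update DB: {} -> {}".format(POWER_STATES.get(db_state, db_state),
                                                POWER_STATES.get(vm_state, vm_state))
        return "in sync"

    # The case above: DB power_state 0, VM power_state 1, task_state "spawning"
    print(sync_power_state(0, 1, "spawning"))
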
Apr 17 22:14:54 user nova-compute[71972]: DEBUG nova.compute.manager [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:14:54 user nova-compute[71972]: DEBUG nova.compute.manager [req-4ec108c7-3404-4742-b4ae-c1e686a8a671 req-95a21b85-d473-44d9-b1ad-77e3b5c6b558 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Received event network-vif-plugged-663fa6c4-7d6d-4083-8ceb-7d08ec368373 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:14:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-4ec108c7-3404-4742-b4ae-c1e686a8a671 req-95a21b85-d473-44d9-b1ad-77e3b5c6b558 service nova] Acquiring lock "ae185464-abd9-412d-bde1-d667c074abf8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:14:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-4ec108c7-3404-4742-b4ae-c1e686a8a671 req-95a21b85-d473-44d9-b1ad-77e3b5c6b558 service nova] Lock "ae185464-abd9-412d-bde1-d667c074abf8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:14:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-4ec108c7-3404-4742-b4ae-c1e686a8a671 req-95a21b85-d473-44d9-b1ad-77e3b5c6b558 service nova] Lock "ae185464-abd9-412d-bde1-d667c074abf8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:54 user nova-compute[71972]: DEBUG nova.compute.manager [req-4ec108c7-3404-4742-b4ae-c1e686a8a671 req-95a21b85-d473-44d9-b1ad-77e3b5c6b558 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] No waiting events found dispatching network-vif-plugged-663fa6c4-7d6d-4083-8ceb-7d08ec368373 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:14:54 user nova-compute[71972]: WARNING nova.compute.manager [req-4ec108c7-3404-4742-b4ae-c1e686a8a671 req-95a21b85-d473-44d9-b1ad-77e3b5c6b558 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Received unexpected event network-vif-plugged-663fa6c4-7d6d-4083-8ceb-7d08ec368373 for instance with vm_state building and task_state spawning. Apr 17 22:14:54 user nova-compute[71972]: INFO nova.compute.manager [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Took 6.15 seconds to build instance. 
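Editor's note: the INFO lines "Took 5.34 seconds to spawn the instance on the hypervisor" and "Took 6.15 seconds to build instance" are the most useful timing signal in this journal; the roughly 0.8 s difference is time spent outside the libvirt spawn itself (resource claim, network allocation, image preparation). A small, self-contained helper for pulling those figures out of a dump like this one (the script name and how you feed it the journal are up to you; nothing here is part of Nova):

    # Log-analysis helper, not part of Nova: pull "Took X seconds to spawn/build"
    # figures out of a nova-compute journal dump such as this one.
    import re
    import sys

    PATTERN = re.compile(
        r"\[instance: (?P<uuid>[0-9a-f-]{36})\] Took (?P<secs>[0-9.]+) seconds "
        r"to (?P<what>spawn the instance on the hypervisor|build instance)")

    def extract_timings(lines):
        for line in lines:
            m = PATTERN.search(line)
            if m:
                what = "spawn" if m.group("what").startswith("spawn") else "build"
                yield m.group("uuid"), what, float(m.group("secs"))

    if __name__ == "__main__":
        # e.g.  journalctl -u <nova-compute unit> | python3 spawn_timings.py
        # (spawn_timings.py is just a suggested name for this snippet)
        for uuid, what, secs in extract_timings(sys.stdin):
            print("{}  {:5s}  {:6.2f}s".format(uuid, what, secs))
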
Apr 17 22:14:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7e63dfa1-f36f-4e8d-85e7-1d27e7e701c8 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "ae185464-abd9-412d-bde1-d667c074abf8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.242s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:14:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:14:57 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:07 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:12 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:13 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._sync_power_states {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:15:13 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Triggering sync for uuid 7bb0580b-727f-4168-9d56-56dcb4fa404e {{(pid=71972) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 17 22:15:13 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Triggering sync for uuid 99cb7131-abb8-41d6-bddd-c3bc943b7678 {{(pid=71972) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 17 22:15:13 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Triggering sync for uuid 559768be-5a58-42c1-bbe8-e87684a0f772 {{(pid=71972) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 17 22:15:13 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Triggering sync for uuid ae185464-abd9-412d-bde1-d667c074abf8 {{(pid=71972) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 17 22:15:13 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "7bb0580b-727f-4168-9d56-56dcb4fa404e" by 
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:15:13 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "7bb0580b-727f-4168-9d56-56dcb4fa404e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:15:13 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "99cb7131-abb8-41d6-bddd-c3bc943b7678" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:15:13 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "99cb7131-abb8-41d6-bddd-c3bc943b7678" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:15:13 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "559768be-5a58-42c1-bbe8-e87684a0f772" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:15:13 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "559768be-5a58-42c1-bbe8-e87684a0f772" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:15:13 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "ae185464-abd9-412d-bde1-d667c074abf8" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:15:13 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "ae185464-abd9-412d-bde1-d667c074abf8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:15:13 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "559768be-5a58-42c1-bbe8-e87684a0f772" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.060s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:15:13 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "99cb7131-abb8-41d6-bddd-c3bc943b7678" "released" by 
"nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.064s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:15:13 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "ae185464-abd9-412d-bde1-d667c074abf8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.065s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:15:13 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "7bb0580b-727f-4168-9d56-56dcb4fa404e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.087s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:15:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:17 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:21 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:21 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:26 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:26 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:26 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:35 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 
{{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:36 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:15:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:15:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:15:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:15:36 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:15:36 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:15:36 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:15:36 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:15:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit 
--as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json" returned: 0 in 0.138s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:15:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:15:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:15:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:15:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:15:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/559768be-5a58-42c1-bbe8-e87684a0f772/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:15:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "ac56fdf0-5568-434b-ad61-805634c2beeb" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:15:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:15:37 user nova-compute[71972]: DEBUG nova.compute.manager [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Starting instance... {{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:15:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/559768be-5a58-42c1-bbe8-e87684a0f772/disk --force-share --output=json" returned: 0 in 0.147s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:15:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/559768be-5a58-42c1-bbe8-e87684a0f772/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:15:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:15:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:15:37 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:15:37 user nova-compute[71972]: INFO nova.compute.claims [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Claim successful on node user Apr 17 22:15:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/559768be-5a58-42c1-bbe8-e87684a0f772/disk --force-share --output=json" returned: 0 in 0.135s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:15:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ae185464-abd9-412d-bde1-d667c074abf8/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:15:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ae185464-abd9-412d-bde1-d667c074abf8/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:15:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ae185464-abd9-412d-bde1-d667c074abf8/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:15:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ae185464-abd9-412d-bde1-d667c074abf8/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:15:37 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-4925adff-7b67-4056-a988-939c3b7b3fca 
tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:15:38 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:15:38 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:15:38 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=8631MB free_disk=26.599166870117188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", 
"product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.814s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG nova.compute.manager [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Start building networks asynchronously for instance. 
{{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.024s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG nova.compute.manager [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Allocating IP information in the background. {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG nova.network.neutron [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:15:38 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 22:15:38 user nova-compute[71972]: DEBUG nova.compute.manager [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Start building block device mappings for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 7bb0580b-727f-4168-9d56-56dcb4fa404e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 99cb7131-abb8-41d6-bddd-c3bc943b7678 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 559768be-5a58-42c1-bbe8-e87684a0f772 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance ae185464-abd9-412d-bde1-d667c074abf8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance ac56fdf0-5568-434b-ad61-805634c2beeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 5 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=1152MB phys_disk=40GB used_disk=5GB total_vcpus=12 used_vcpus=5 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG nova.policy [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6df5551ab4974747a0412ce089b770b0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '26967ac9e8cb45b6aea04a699d4a1eca', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG nova.compute.manager [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Start spawning the instance on the hypervisor. 
{{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:15:38 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Creating image(s) Apr 17 22:15:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "/opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "/opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "/opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: 
{'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.143s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.429s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit 
--as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.144s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk 1073741824" returned: 0 in 0.047s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.196s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:15:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:15:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.141s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:15:39 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Checking if we can resize image 
/opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk. size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:15:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:15:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json" returned: 0 in 0.156s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:15:39 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Cannot resize image /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk to a smaller size. {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:15:39 user nova-compute[71972]: DEBUG nova.objects.instance [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lazy-loading 'migration_context' on Instance uuid ac56fdf0-5568-434b-ad61-805634c2beeb {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:15:39 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:15:39 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Ensure instance console log exists: /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:15:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:15:39 user 
nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:15:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:15:39 user nova-compute[71972]: DEBUG nova.network.neutron [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Successfully created port: 4acd5cd6-47e1-4450-8baa-092f9444c03f {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:15:39 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:15:39 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.network.neutron [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Successfully updated port: 4acd5cd6-47e1-4450-8baa-092f9444c03f {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "refresh_cache-ac56fdf0-5568-434b-ad61-805634c2beeb" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquired lock "refresh_cache-ac56fdf0-5568-434b-ad61-805634c2beeb" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.network.neutron [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Building network info cache for instance {{(pid=71972) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.compute.manager [req-b6a2c0e9-0774-42ef-b02f-a37b2f9e1ad0 req-87a93554-6178-4763-a6d9-036208566677 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Received event network-changed-4acd5cd6-47e1-4450-8baa-092f9444c03f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.compute.manager [req-b6a2c0e9-0774-42ef-b02f-a37b2f9e1ad0 req-87a93554-6178-4763-a6d9-036208566677 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Refreshing instance network info cache due to event network-changed-4acd5cd6-47e1-4450-8baa-092f9444c03f. {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-b6a2c0e9-0774-42ef-b02f-a37b2f9e1ad0 req-87a93554-6178-4763-a6d9-036208566677 service nova] Acquiring lock "refresh_cache-ac56fdf0-5568-434b-ad61-805634c2beeb" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.network.neutron [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Instance cache missing network info. {{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.network.neutron [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Updating instance_info_cache with network_info: [{"id": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "address": "fa:16:3e:5e:2a:3b", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4acd5cd6-47", "ovs_interfaceid": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Releasing lock "refresh_cache-ac56fdf0-5568-434b-ad61-805634c2beeb" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.compute.manager [None req-4925adff-7b67-4056-a988-939c3b7b3fca 
tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Instance network_info: |[{"id": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "address": "fa:16:3e:5e:2a:3b", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4acd5cd6-47", "ovs_interfaceid": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-b6a2c0e9-0774-42ef-b02f-a37b2f9e1ad0 req-87a93554-6178-4763-a6d9-036208566677 service nova] Acquired lock "refresh_cache-ac56fdf0-5568-434b-ad61-805634c2beeb" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.network.neutron [req-b6a2c0e9-0774-42ef-b02f-a37b2f9e1ad0 req-87a93554-6178-4763-a6d9-036208566677 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Refreshing network info cache for port 4acd5cd6-47e1-4450-8baa-092f9444c03f {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Start _get_guest_xml network_info=[{"id": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "address": "fa:16:3e:5e:2a:3b", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4acd5cd6-47", "ovs_interfaceid": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 
'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:15:40 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:15:40 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:15:40 user 
nova-compute[71972]: DEBUG nova.virt.hardware [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-4925adff-7b67-4056-a988-939c3b7b3fca 
tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:15:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1160681713',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1160681713',id=19,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='26967ac9e8cb45b6aea04a699d4a1eca',ramdisk_id='',reservation_id='r-lp93sb9o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-480550513',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:15:39Z,user_data=None,user_id='6df5551ab4974747a0412ce089b770b0',uuid=ac56fdf0-5568-434b-ad61-805634c2beeb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "address": "fa:16:3e:5e:2a:3b", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4acd5cd6-47", "ovs_interfaceid": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 
tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Converting VIF {"id": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "address": "fa:16:3e:5e:2a:3b", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4acd5cd6-47", "ovs_interfaceid": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:2a:3b,bridge_name='br-int',has_traffic_filtering=True,id=4acd5cd6-47e1-4450-8baa-092f9444c03f,network=Network(bf926b5c-baa2-457d-aab9-e2deea0b84c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4acd5cd6-47') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.objects.instance [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lazy-loading 'pci_devices' on Instance uuid ac56fdf0-5568-434b-ad61-805634c2beeb {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] End _get_guest_xml xml= Apr 17 22:15:40 user nova-compute[71972]: ac56fdf0-5568-434b-ad61-805634c2beeb Apr 17 22:15:40 user nova-compute[71972]: instance-00000013 Apr 17 22:15:40 user nova-compute[71972]: 131072 Apr 17 22:15:40 user nova-compute[71972]: 1 Apr 17 22:15:40 user nova-compute[71972]: Apr 17 22:15:40 user nova-compute[71972]: Apr 17 22:15:40 user nova-compute[71972]: Apr 17 22:15:40 user nova-compute[71972]: tempest-ServerBootFromVolumeStableRescueTest-server-1160681713 Apr 17 22:15:40 user nova-compute[71972]: 2023-04-17 22:15:40 Apr 17 22:15:40 user nova-compute[71972]: Apr 17 22:15:40 user nova-compute[71972]: 128 Apr 17 22:15:40 user nova-compute[71972]: 1 Apr 17 22:15:40 user nova-compute[71972]: 0 Apr 17 22:15:40 user nova-compute[71972]: 0 Apr 17 22:15:40 user nova-compute[71972]: 1 Apr 17 22:15:40 user nova-compute[71972]: Apr 17 22:15:40 user nova-compute[71972]: Apr 17 22:15:40 user nova-compute[71972]: tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member Apr 17 22:15:40 user 
nova-compute[71972]: [remaining guest domain XML elements; the angle-bracketed markup was not preserved in this capture. Recoverable values: project tempest-ServerBootFromVolumeStableRescueTest-480550513, SMBIOS sysinfo OpenStack Foundation / OpenStack Nova / 0.0.0, instance UUID ac56fdf0-5568-434b-ad61-805634c2beeb, product Virtual Machine, OS type hvm, CPU model Nehalem, RNG backend /dev/urandom] Apr 17 22:15:40 user nova-compute[71972]: {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:15:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1160681713',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1160681713',id=19,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='26967ac9e8cb45b6aea04a699d4a1eca',ramdisk_id='',reservation_id='r-lp93sb9o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-480550513',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:15:39Z,user_data=None,user_id='6df5551ab4974747a0412ce089b770b0',uuid=ac56fdf0-5568-434b-ad61-805634c2beeb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "address": "fa:16:3e:5e:2a:3b", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4acd5cd6-47", "ovs_interfaceid": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Converting VIF {"id": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "address": 
"fa:16:3e:5e:2a:3b", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4acd5cd6-47", "ovs_interfaceid": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:2a:3b,bridge_name='br-int',has_traffic_filtering=True,id=4acd5cd6-47e1-4450-8baa-092f9444c03f,network=Network(bf926b5c-baa2-457d-aab9-e2deea0b84c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4acd5cd6-47') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG os_vif [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:2a:3b,bridge_name='br-int',has_traffic_filtering=True,id=4acd5cd6-47e1-4450-8baa-092f9444c03f,network=Network(bf926b5c-baa2-457d-aab9-e2deea0b84c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4acd5cd6-47') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4acd5cd6-47, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:15:40 user 
nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4acd5cd6-47, col_values=(('external_ids', {'iface-id': '4acd5cd6-47e1-4450-8baa-092f9444c03f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5e:2a:3b', 'vm-uuid': 'ac56fdf0-5568-434b-ad61-805634c2beeb'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:40 user nova-compute[71972]: INFO os_vif [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:2a:3b,bridge_name='br-int',has_traffic_filtering=True,id=4acd5cd6-47e1-4450-8baa-092f9444c03f,network=Network(bf926b5c-baa2-457d-aab9-e2deea0b84c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4acd5cd6-47') Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] No BDM found with device name vda, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] No VIF found with MAC fa:16:3e:5e:2a:3b, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.network.neutron [req-b6a2c0e9-0774-42ef-b02f-a37b2f9e1ad0 req-87a93554-6178-4763-a6d9-036208566677 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Updated VIF entry in instance network info cache for port 4acd5cd6-47e1-4450-8baa-092f9444c03f. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG nova.network.neutron [req-b6a2c0e9-0774-42ef-b02f-a37b2f9e1ad0 req-87a93554-6178-4763-a6d9-036208566677 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Updating instance_info_cache with network_info: [{"id": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "address": "fa:16:3e:5e:2a:3b", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4acd5cd6-47", "ovs_interfaceid": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:15:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-b6a2c0e9-0774-42ef-b02f-a37b2f9e1ad0 req-87a93554-6178-4763-a6d9-036208566677 service nova] Releasing lock "refresh_cache-ac56fdf0-5568-434b-ad61-805634c2beeb" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:15:41 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:15:41 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:15:41 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:41 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:41 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:42 user nova-compute[71972]: DEBUG nova.compute.manager [req-39322b8e-2611-421c-83aa-69af3c8c5529 req-756eefda-ba49-4bfb-9733-4512c05bb225 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Received event network-vif-plugged-4acd5cd6-47e1-4450-8baa-092f9444c03f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:15:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-39322b8e-2611-421c-83aa-69af3c8c5529 req-756eefda-ba49-4bfb-9733-4512c05bb225 service nova] Acquiring lock 
"ac56fdf0-5568-434b-ad61-805634c2beeb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:15:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-39322b8e-2611-421c-83aa-69af3c8c5529 req-756eefda-ba49-4bfb-9733-4512c05bb225 service nova] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:15:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-39322b8e-2611-421c-83aa-69af3c8c5529 req-756eefda-ba49-4bfb-9733-4512c05bb225 service nova] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:15:42 user nova-compute[71972]: DEBUG nova.compute.manager [req-39322b8e-2611-421c-83aa-69af3c8c5529 req-756eefda-ba49-4bfb-9733-4512c05bb225 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] No waiting events found dispatching network-vif-plugged-4acd5cd6-47e1-4450-8baa-092f9444c03f {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:15:42 user nova-compute[71972]: WARNING nova.compute.manager [req-39322b8e-2611-421c-83aa-69af3c8c5529 req-756eefda-ba49-4bfb-9733-4512c05bb225 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Received unexpected event network-vif-plugged-4acd5cd6-47e1-4450-8baa-092f9444c03f for instance with vm_state building and task_state spawning. 
Apr 17 22:15:42 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:42 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:42 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:42 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:42 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:43 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:15:43 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:15:43 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Rebuilding the list of instances to heal {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 22:15:43 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Skipping network cache update for instance because it is Building. 
{{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9805}} Apr 17 22:15:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "refresh_cache-7bb0580b-727f-4168-9d56-56dcb4fa404e" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:15:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquired lock "refresh_cache-7bb0580b-727f-4168-9d56-56dcb4fa404e" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:15:43 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Forcefully refreshing network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 22:15:43 user nova-compute[71972]: DEBUG nova.objects.instance [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lazy-loading 'info_cache' on Instance uuid 7bb0580b-727f-4168-9d56-56dcb4fa404e {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:15:43 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:15:43 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] VM Resumed (Lifecycle Event) Apr 17 22:15:43 user nova-compute[71972]: DEBUG nova.compute.manager [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Instance event wait completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:15:43 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:15:43 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Instance spawned successfully. 
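The "Acquiring lock" / "Lock ... acquired by" / "Lock ... released by" DEBUG lines that recur throughout this trace (vgpu_resources, compute_resources, the refresh_cache-<uuid> locks, the per-instance -events lock) are emitted by oslo.concurrency's lockutils wrappers rather than by Nova's own code. A minimal sketch of the two call patterns involved, with illustrative function names and bodies:

    from oslo_concurrency import lockutils
    from oslo_log import log as logging

    LOG = logging.getLogger(__name__)

    # Decorator form: produces the 'Lock "..." acquired by "...inner" :: waited/held'
    # messages, as seen for the vgpu_resources and compute_resources locks.
    @lockutils.synchronized('vgpu_resources')
    def allocate_mdevs():
        LOG.debug('allocating mdevs while holding the lock')

    # Context-manager form: produces the plain Acquiring/Acquired/Releasing lock
    # messages used for the refresh_cache-<instance uuid> locks.
    def refresh_instance_cache(instance_uuid):
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            LOG.debug('rebuilding the network info cache')

In this trace the locks appear to be in-process (no external lock-file messages), and the waited/held timings, e.g. "held 0.429s" for compute_resources, come from the same wrappers.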
Apr 17 22:15:43 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:15:43 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:15:43 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:15:43 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:15:43 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:15:43 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:15:43 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:15:43 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:15:43 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None 
req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Found default for hw_vif_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:15:43 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:15:43 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:15:43 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] VM Started (Lifecycle Event) Apr 17 22:15:43 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:15:43 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:15:43 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:15:44 user nova-compute[71972]: INFO nova.compute.manager [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Took 5.37 seconds to spawn the instance on the hypervisor. Apr 17 22:15:44 user nova-compute[71972]: DEBUG nova.compute.manager [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:15:44 user nova-compute[71972]: INFO nova.compute.manager [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Took 6.53 seconds to build instance. 
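Note: in the "Synchronizing instance power state" entries above, "DB power_state: 0, VM power_state: 1" are nova's numeric power-state codes (0 = NOSTATE, 1 = RUNNING). The lookup below is written out as an assumption matching nova.compute.power_state in recent releases, purely to make those numbers readable:

    # Assumed mapping of nova's numeric power-state codes (treat as illustrative).
    POWER_STATES = {
        0x00: "NOSTATE",    # DB power_state before the guest has been observed
        0x01: "RUNNING",    # what libvirt reports once the domain is started
        0x03: "PAUSED",
        0x04: "SHUTDOWN",
        0x06: "CRASHED",
        0x07: "SUSPENDED",
    }

    # e.g. the entry above: DB power_state 0 vs. VM power_state 1
    print(POWER_STATES[0], "->", POWER_STATES[1])  # NOSTATE -> RUNNING
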
Apr 17 22:15:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-4925adff-7b67-4056-a988-939c3b7b3fca tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.649s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:15:44 user nova-compute[71972]: DEBUG nova.compute.manager [req-ddecab61-fa29-4494-b27a-a304b940c53a req-9f4332eb-1aa8-4e7c-be4b-8bbce1d4c43d service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Received event network-vif-plugged-4acd5cd6-47e1-4450-8baa-092f9444c03f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:15:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-ddecab61-fa29-4494-b27a-a304b940c53a req-9f4332eb-1aa8-4e7c-be4b-8bbce1d4c43d service nova] Acquiring lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:15:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-ddecab61-fa29-4494-b27a-a304b940c53a req-9f4332eb-1aa8-4e7c-be4b-8bbce1d4c43d service nova] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:15:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-ddecab61-fa29-4494-b27a-a304b940c53a req-9f4332eb-1aa8-4e7c-be4b-8bbce1d4c43d service nova] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:15:44 user nova-compute[71972]: DEBUG nova.compute.manager [req-ddecab61-fa29-4494-b27a-a304b940c53a req-9f4332eb-1aa8-4e7c-be4b-8bbce1d4c43d service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] No waiting events found dispatching network-vif-plugged-4acd5cd6-47e1-4450-8baa-092f9444c03f {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:15:44 user nova-compute[71972]: WARNING nova.compute.manager [req-ddecab61-fa29-4494-b27a-a304b940c53a req-9f4332eb-1aa8-4e7c-be4b-8bbce1d4c43d service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Received unexpected event network-vif-plugged-4acd5cd6-47e1-4450-8baa-092f9444c03f for instance with vm_state active and task_state None. 
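Note: the network-vif-plugged-<port-uuid> events above are delivered by Neutron through Nova's os-server-external-events API; the compute manager then matches them against any registered waiters, which is why it logs "No waiting events found" and a WARNING once the build has already completed. A hedged sketch of that notification (endpoint URL and token are placeholders; in a real deployment Neutron makes this call itself):

    # Sketch of the notification that produces the "Received event
    # network-vif-plugged-..." entries above; NOVA and TOKEN are placeholders.
    import requests

    NOVA = "http://controller.example/compute/v2.1"  # assumed endpoint
    TOKEN = "gAAAA..."                               # assumed Keystone token

    body = {"events": [{
        "name": "network-vif-plugged",
        "server_uuid": "ac56fdf0-5568-434b-ad61-805634c2beeb",
        "tag": "4acd5cd6-47e1-4450-8baa-092f9444c03f",  # the port UUID
        "status": "completed",
    }]}
    resp = requests.post(f"{NOVA}/os-server-external-events",
                         json=body, headers={"X-Auth-Token": TOKEN})
    resp.raise_for_status()
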
Apr 17 22:15:44 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Updating instance_info_cache with network_info: [{"id": "06548a29-a501-4b57-97f1-8afe930c8463", "address": "fa:16:3e:d4:82:7c", "network": {"id": "3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap06548a29-a5", "ovs_interfaceid": "06548a29-a501-4b57-97f1-8afe930c8463", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:15:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Releasing lock "refresh_cache-7bb0580b-727f-4168-9d56-56dcb4fa404e" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:15:44 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Updated the network info_cache for instance {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 22:15:44 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:15:44 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:15:45 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:50 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:55 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:57 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:15:57 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:00 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:00 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:07 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:08 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:10 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:12 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:15 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:17 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:17 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:20 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:20 user nova-compute[71972]: DEBUG nova.compute.manager [req-25fa6656-c44a-410d-aff9-3c6efcc59eb4 req-417da33f-29d2-4f09-a201-9ba8e7a94a0d service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Received event 
network-changed-2459db09-685c-4338-848c-fc76939bf0da {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:16:20 user nova-compute[71972]: DEBUG nova.compute.manager [req-25fa6656-c44a-410d-aff9-3c6efcc59eb4 req-417da33f-29d2-4f09-a201-9ba8e7a94a0d service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Refreshing instance network info cache due to event network-changed-2459db09-685c-4338-848c-fc76939bf0da. {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:16:20 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-25fa6656-c44a-410d-aff9-3c6efcc59eb4 req-417da33f-29d2-4f09-a201-9ba8e7a94a0d service nova] Acquiring lock "refresh_cache-559768be-5a58-42c1-bbe8-e87684a0f772" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:16:20 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-25fa6656-c44a-410d-aff9-3c6efcc59eb4 req-417da33f-29d2-4f09-a201-9ba8e7a94a0d service nova] Acquired lock "refresh_cache-559768be-5a58-42c1-bbe8-e87684a0f772" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:16:20 user nova-compute[71972]: DEBUG nova.network.neutron [req-25fa6656-c44a-410d-aff9-3c6efcc59eb4 req-417da33f-29d2-4f09-a201-9ba8e7a94a0d service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Refreshing network info cache for port 2459db09-685c-4338-848c-fc76939bf0da {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:16:21 user nova-compute[71972]: DEBUG nova.network.neutron [req-25fa6656-c44a-410d-aff9-3c6efcc59eb4 req-417da33f-29d2-4f09-a201-9ba8e7a94a0d service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Updated VIF entry in instance network info cache for port 2459db09-685c-4338-848c-fc76939bf0da. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:16:21 user nova-compute[71972]: DEBUG nova.network.neutron [req-25fa6656-c44a-410d-aff9-3c6efcc59eb4 req-417da33f-29d2-4f09-a201-9ba8e7a94a0d service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Updating instance_info_cache with network_info: [{"id": "2459db09-685c-4338-848c-fc76939bf0da", "address": "fa:16:3e:25:5d:e0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.10", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2459db09-68", "ovs_interfaceid": "2459db09-685c-4338-848c-fc76939bf0da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:16:21 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-25fa6656-c44a-410d-aff9-3c6efcc59eb4 req-417da33f-29d2-4f09-a201-9ba8e7a94a0d service nova] Releasing lock "refresh_cache-559768be-5a58-42c1-bbe8-e87684a0f772" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:16:21 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "559768be-5a58-42c1-bbe8-e87684a0f772" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "559768be-5a58-42c1-bbe8-e87684a0f772" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "559768be-5a58-42c1-bbe8-e87684a0f772-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None 
req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "559768be-5a58-42c1-bbe8-e87684a0f772-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "559768be-5a58-42c1-bbe8-e87684a0f772-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:16:22 user nova-compute[71972]: INFO nova.compute.manager [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Terminating instance Apr 17 22:16:22 user nova-compute[71972]: DEBUG nova.compute.manager [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Start destroying the instance on the hypervisor. {{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG nova.compute.manager [req-a181d85d-8d70-4e9f-b8c5-7f878853a066 req-916a92fd-cfcd-4e36-abbe-d52e66d727de service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Received event network-vif-unplugged-2459db09-685c-4338-848c-fc76939bf0da {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-a181d85d-8d70-4e9f-b8c5-7f878853a066 req-916a92fd-cfcd-4e36-abbe-d52e66d727de service nova] Acquiring lock "559768be-5a58-42c1-bbe8-e87684a0f772-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-a181d85d-8d70-4e9f-b8c5-7f878853a066 req-916a92fd-cfcd-4e36-abbe-d52e66d727de service nova] Lock "559768be-5a58-42c1-bbe8-e87684a0f772-events" acquired by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-a181d85d-8d70-4e9f-b8c5-7f878853a066 req-916a92fd-cfcd-4e36-abbe-d52e66d727de service nova] Lock "559768be-5a58-42c1-bbe8-e87684a0f772-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG nova.compute.manager [req-a181d85d-8d70-4e9f-b8c5-7f878853a066 req-916a92fd-cfcd-4e36-abbe-d52e66d727de service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] No waiting events found dispatching network-vif-unplugged-2459db09-685c-4338-848c-fc76939bf0da {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG nova.compute.manager [req-a181d85d-8d70-4e9f-b8c5-7f878853a066 req-916a92fd-cfcd-4e36-abbe-d52e66d727de service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Received event network-vif-unplugged-2459db09-685c-4338-848c-fc76939bf0da for instance with task_state deleting. {{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:16:22 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Instance destroyed successfully. Apr 17 22:16:22 user nova-compute[71972]: DEBUG nova.objects.instance [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lazy-loading 'resources' on Instance uuid 559768be-5a58-42c1-bbe8-e87684a0f772 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:14:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-2004521020',display_name='tempest-AttachVolumeNegativeTest-server-2004521020',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumenegativetest-server-2004521020',id=17,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEsdLKOyShDW6lsdw3PYvB4QT9qwFwLG8+mmqjbOtzoXRKJ/PazrnVxbPepdTwCtT1bx/zxUWI3ltRCf+Nv+ft4L7HR3JkoHrl3xtcd/Er5SEh21KxWlnMx/s+XIYUXdJg==',key_name='tempest-keypair-1141362177',keypairs=,launch_index=0,launched_at=2023-04-17T22:14:36Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='e52724ed9bc54905bd5eddd8504e4c77',ramdisk_id='',reservation_id='r-vgi6066b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeNegativeTest-1678965362',owner_user_name='tempest-AttachVolumeNegativeTest-1678965362-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:14:37Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='52b3e35c03b54ae4b5dabfb1325886a9',uuid=559768be-5a58-42c1-bbe8-e87684a0f772,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2459db09-685c-4338-848c-fc76939bf0da", "address": "fa:16:3e:25:5d:e0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.10", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2459db09-68", "ovs_interfaceid": "2459db09-685c-4338-848c-fc76939bf0da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Converting VIF {"id": "2459db09-685c-4338-848c-fc76939bf0da", "address": "fa:16:3e:25:5d:e0", "network": {"id": "bd9d53b7-0e22-42c4-8b91-49575cadf44f", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1068401962-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.0.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.10", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "e52724ed9bc54905bd5eddd8504e4c77", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2459db09-68", "ovs_interfaceid": "2459db09-685c-4338-848c-fc76939bf0da", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:25:5d:e0,bridge_name='br-int',has_traffic_filtering=True,id=2459db09-685c-4338-848c-fc76939bf0da,network=Network(bd9d53b7-0e22-42c4-8b91-49575cadf44f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2459db09-68') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG os_vif [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:25:5d:e0,bridge_name='br-int',has_traffic_filtering=True,id=2459db09-685c-4338-848c-fc76939bf0da,network=Network(bd9d53b7-0e22-42c4-8b91-49575cadf44f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2459db09-68') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2459db09-68, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:16:22 user nova-compute[71972]: INFO os_vif [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:25:5d:e0,bridge_name='br-int',has_traffic_filtering=True,id=2459db09-685c-4338-848c-fc76939bf0da,network=Network(bd9d53b7-0e22-42c4-8b91-49575cadf44f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2459db09-68') Apr 17 22:16:22 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-7478356f-64c3-4042-822f-473c4436cb97 
tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Deleting instance files /opt/stack/data/nova/instances/559768be-5a58-42c1-bbe8-e87684a0f772_del Apr 17 22:16:22 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Deletion of /opt/stack/data/nova/instances/559768be-5a58-42c1-bbe8-e87684a0f772_del complete Apr 17 22:16:22 user nova-compute[71972]: INFO nova.compute.manager [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Took 0.67 seconds to destroy the instance on the hypervisor. Apr 17 22:16:22 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:16:22 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:16:24 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:16:24 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Took 1.27 seconds to deallocate network for instance. 
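Note: the DelPortCommand transaction above (port=tap2459db09-68, bridge=br-int, if_exists=True) is os-vif removing the instance's tap port from the integration bridge via an ovsdbapp OVSDB transaction. The CLI equivalent, shelled out from Python here purely as an illustration, would be:

    # Illustrative equivalent of the DelPortCommand logged above, run through
    # ovs-vsctl instead of ovsdbapp's native OVSDB transaction.
    import subprocess

    subprocess.run(
        ["ovs-vsctl", "--if-exists", "del-port", "br-int", "tap2459db09-68"],
        check=True,
    )
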
Apr 17 22:16:24 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:16:24 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:16:24 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:24 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:16:24 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:16:24 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.215s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:16:24 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Deleted allocations for instance 559768be-5a58-42c1-bbe8-e87684a0f772 Apr 17 22:16:24 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7478356f-64c3-4042-822f-473c4436cb97 tempest-AttachVolumeNegativeTest-1678965362 tempest-AttachVolumeNegativeTest-1678965362-project-member] Lock "559768be-5a58-42c1-bbe8-e87684a0f772" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.361s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:16:24 user nova-compute[71972]: DEBUG nova.compute.manager [req-ca823a33-84a8-479f-9093-77998ffb810e 
req-e3cb9993-ce66-4064-b655-e887fbfd11f4 service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Received event network-vif-plugged-2459db09-685c-4338-848c-fc76939bf0da {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:16:24 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-ca823a33-84a8-479f-9093-77998ffb810e req-e3cb9993-ce66-4064-b655-e887fbfd11f4 service nova] Acquiring lock "559768be-5a58-42c1-bbe8-e87684a0f772-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:16:24 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-ca823a33-84a8-479f-9093-77998ffb810e req-e3cb9993-ce66-4064-b655-e887fbfd11f4 service nova] Lock "559768be-5a58-42c1-bbe8-e87684a0f772-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:16:24 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-ca823a33-84a8-479f-9093-77998ffb810e req-e3cb9993-ce66-4064-b655-e887fbfd11f4 service nova] Lock "559768be-5a58-42c1-bbe8-e87684a0f772-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:16:24 user nova-compute[71972]: DEBUG nova.compute.manager [req-ca823a33-84a8-479f-9093-77998ffb810e req-e3cb9993-ce66-4064-b655-e887fbfd11f4 service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] No waiting events found dispatching network-vif-plugged-2459db09-685c-4338-848c-fc76939bf0da {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:16:24 user nova-compute[71972]: WARNING nova.compute.manager [req-ca823a33-84a8-479f-9093-77998ffb810e req-e3cb9993-ce66-4064-b655-e887fbfd11f4 service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Received unexpected event network-vif-plugged-2459db09-685c-4338-848c-fc76939bf0da for instance with vm_state deleted and task_state None. 
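Note: the "Deleted allocations for instance 559768be-..." entry a few lines above corresponds to removing the instance's consumer record in the Placement service. A sketch of that call (endpoint and token are placeholders; the consumer UUID is the deleted instance):

    # Sketch only: the Placement call behind "Deleted allocations for instance ...".
    import requests

    PLACEMENT = "http://controller.example/placement"  # assumed endpoint
    TOKEN = "gAAAA..."                                  # assumed Keystone token
    consumer = "559768be-5a58-42c1-bbe8-e87684a0f772"   # the deleted instance

    resp = requests.delete(
        f"{PLACEMENT}/allocations/{consumer}",
        headers={"X-Auth-Token": TOKEN},
    )
    resp.raise_for_status()  # Placement returns 204 No Content on success
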
Apr 17 22:16:24 user nova-compute[71972]: DEBUG nova.compute.manager [req-ca823a33-84a8-479f-9093-77998ffb810e req-e3cb9993-ce66-4064-b655-e887fbfd11f4 service nova] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Received event network-vif-deleted-2459db09-685c-4338-848c-fc76939bf0da {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:16:27 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:27 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:37 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:16:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:16:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:16:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:16:37 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:16:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit 
--as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:16:37 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:16:37 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] VM Stopped (Lifecycle Event) Apr 17 22:16:37 user nova-compute[71972]: DEBUG nova.compute.manager [None req-d869a1d7-8ae8-4ba0-bdbb-b3420487be07 None None] [instance: 559768be-5a58-42c1-bbe8-e87684a0f772] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:16:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:16:37 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:16:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:16:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:16:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:16:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd 
(subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:16:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e/disk --force-share --output=json" returned: 0 in 0.149s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:16:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:16:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:16:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:16:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:16:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ae185464-abd9-412d-bde1-d667c074abf8/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:16:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ae185464-abd9-412d-bde1-d667c074abf8/disk --force-share --output=json" returned: 0 in 0.132s 
{{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:16:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ae185464-abd9-412d-bde1-d667c074abf8/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:16:38 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ae185464-abd9-412d-bde1-d667c074abf8/disk --force-share --output=json" returned: 0 in 0.152s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:16:39 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:16:39 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:16:39 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=8624MB free_disk=26.58519744873047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", 
"address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, 
"label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:16:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:16:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:16:39 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 7bb0580b-727f-4168-9d56-56dcb4fa404e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:16:39 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 99cb7131-abb8-41d6-bddd-c3bc943b7678 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:16:39 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance ae185464-abd9-412d-bde1-d667c074abf8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:16:39 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance ac56fdf0-5568-434b-ad61-805634c2beeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:16:39 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 4 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:16:39 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=1024MB phys_disk=40GB used_disk=4GB total_vcpus=12 used_vcpus=4 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:16:39 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:16:39 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:16:39 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:16:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.327s 
{{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:16:39 user nova-compute[71972]: DEBUG nova.compute.manager [req-3e0061bf-9af6-4fbd-a573-b162c0f0e00f req-c44571c0-a549-47d3-8f42-03d059e44b42 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Received event network-changed-663fa6c4-7d6d-4083-8ceb-7d08ec368373 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:16:39 user nova-compute[71972]: DEBUG nova.compute.manager [req-3e0061bf-9af6-4fbd-a573-b162c0f0e00f req-c44571c0-a549-47d3-8f42-03d059e44b42 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Refreshing instance network info cache due to event network-changed-663fa6c4-7d6d-4083-8ceb-7d08ec368373. {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:16:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-3e0061bf-9af6-4fbd-a573-b162c0f0e00f req-c44571c0-a549-47d3-8f42-03d059e44b42 service nova] Acquiring lock "refresh_cache-ae185464-abd9-412d-bde1-d667c074abf8" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:16:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-3e0061bf-9af6-4fbd-a573-b162c0f0e00f req-c44571c0-a549-47d3-8f42-03d059e44b42 service nova] Acquired lock "refresh_cache-ae185464-abd9-412d-bde1-d667c074abf8" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:16:39 user nova-compute[71972]: DEBUG nova.network.neutron [req-3e0061bf-9af6-4fbd-a573-b162c0f0e00f req-c44571c0-a549-47d3-8f42-03d059e44b42 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Refreshing network info cache for port 663fa6c4-7d6d-4083-8ceb-7d08ec368373 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:16:40 user nova-compute[71972]: DEBUG nova.network.neutron [req-3e0061bf-9af6-4fbd-a573-b162c0f0e00f req-c44571c0-a549-47d3-8f42-03d059e44b42 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Updated VIF entry in instance network info cache for port 663fa6c4-7d6d-4083-8ceb-7d08ec368373. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:16:40 user nova-compute[71972]: DEBUG nova.network.neutron [req-3e0061bf-9af6-4fbd-a573-b162c0f0e00f req-c44571c0-a549-47d3-8f42-03d059e44b42 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Updating instance_info_cache with network_info: [{"id": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "address": "fa:16:3e:e9:cb:55", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.57", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap663fa6c4-7d", "ovs_interfaceid": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:16:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-3e0061bf-9af6-4fbd-a573-b162c0f0e00f req-c44571c0-a549-47d3-8f42-03d059e44b42 service nova] Releasing lock "refresh_cache-ae185464-abd9-412d-bde1-d667c074abf8" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:16:40 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:16:40 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:16:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Acquiring lock "ae185464-abd9-412d-bde1-d667c074abf8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:16:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "ae185464-abd9-412d-bde1-d667c074abf8" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:16:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 
tempest-AttachVolumeTestJSON-4448958-project-member] Acquiring lock "ae185464-abd9-412d-bde1-d667c074abf8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:16:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "ae185464-abd9-412d-bde1-d667c074abf8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:16:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "ae185464-abd9-412d-bde1-d667c074abf8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:16:41 user nova-compute[71972]: INFO nova.compute.manager [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Terminating instance Apr 17 22:16:41 user nova-compute[71972]: DEBUG nova.compute.manager [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Start destroying the instance on the hypervisor. 
{{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:16:41 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:41 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:41 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:41 user nova-compute[71972]: DEBUG nova.compute.manager [req-7372b037-af65-4cda-8060-6e67603be3db req-18de933e-4f89-4385-94d8-a660efb882ab service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Received event network-vif-unplugged-663fa6c4-7d6d-4083-8ceb-7d08ec368373 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:16:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-7372b037-af65-4cda-8060-6e67603be3db req-18de933e-4f89-4385-94d8-a660efb882ab service nova] Acquiring lock "ae185464-abd9-412d-bde1-d667c074abf8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:16:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-7372b037-af65-4cda-8060-6e67603be3db req-18de933e-4f89-4385-94d8-a660efb882ab service nova] Lock "ae185464-abd9-412d-bde1-d667c074abf8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:16:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-7372b037-af65-4cda-8060-6e67603be3db req-18de933e-4f89-4385-94d8-a660efb882ab service nova] Lock "ae185464-abd9-412d-bde1-d667c074abf8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:16:41 user nova-compute[71972]: DEBUG nova.compute.manager [req-7372b037-af65-4cda-8060-6e67603be3db req-18de933e-4f89-4385-94d8-a660efb882ab service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] No waiting events found dispatching network-vif-unplugged-663fa6c4-7d6d-4083-8ceb-7d08ec368373 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:16:41 user nova-compute[71972]: DEBUG nova.compute.manager [req-7372b037-af65-4cda-8060-6e67603be3db req-18de933e-4f89-4385-94d8-a660efb882ab service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Received event network-vif-unplugged-663fa6c4-7d6d-4083-8ceb-7d08ec368373 for instance with task_state deleting. {{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:16:42 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Instance destroyed successfully. 
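The repeated "Acquiring lock" / "Lock ... acquired by ... :: waited" / "released ... :: held" lines around the terminate and event handling above come from oslo.concurrency's lockutils, which Nova uses to serialize work per instance and per event queue. A minimal standalone sketch of that pattern, using oslo.concurrency directly rather than Nova's own wrapper (the lock name and function here are illustrative only):

    from oslo_concurrency import lockutils

    # The synchronized decorator's inner wrapper is what emits
    # 'Lock "<name>" acquired by "<qualified function>" :: waited N s'
    # and the matching 'released ... :: held N s' DEBUG lines.
    @lockutils.synchronized('ae185464-abd9-412d-bde1-d667c074abf8-events')
    def _clear_events():
        # The body runs only while the named in-process lock is held;
        # concurrent callers block, which produces the waited/held timings.
        return {}

    _clear_events()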
Apr 17 22:16:42 user nova-compute[71972]: DEBUG nova.objects.instance [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lazy-loading 'resources' on Instance uuid ae185464-abd9-412d-bde1-d667c074abf8 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:16:42 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:14:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1425692942',display_name='tempest-AttachVolumeTestJSON-server-1425692942',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1425692942',id=18,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEigsT6gLV0q/K0pHew8brR9QnMOnMtp0npXVhkRkqKTM+AOv4+ez0mMtBSy/IoES+GPDYkutnXc9/wVhggYNrjJV46CJK3m/8xwidBZV1wpxHBatJMpbDsmiChUV8RfCg==',key_name='tempest-keypair-192135788',keypairs=,launch_index=0,launched_at=2023-04-17T22:14:54Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='19f2a3034fe9458682e1242c91e2ce45',ramdisk_id='',reservation_id='r-e4k2g86x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeTestJSON-4448958',owner_user_name='tempest-AttachVolumeTestJSON-4448958-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:14:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='30c12a4244db438ea682e545c378abe1',uuid=ae185464-abd9-412d-bde1-d667c074abf8,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "address": "fa:16:3e:e9:cb:55", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": [{"address": "172.24.4.57", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap663fa6c4-7d", "ovs_interfaceid": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:16:42 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Converting VIF {"id": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "address": "fa:16:3e:e9:cb:55", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.57", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap663fa6c4-7d", "ovs_interfaceid": "663fa6c4-7d6d-4083-8ceb-7d08ec368373", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:16:42 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e9:cb:55,bridge_name='br-int',has_traffic_filtering=True,id=663fa6c4-7d6d-4083-8ceb-7d08ec368373,network=Network(8ffc4041-9c9c-4b0f-9342-dcd76540515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap663fa6c4-7d') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:16:42 user nova-compute[71972]: DEBUG os_vif [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:cb:55,bridge_name='br-int',has_traffic_filtering=True,id=663fa6c4-7d6d-4083-8ceb-7d08ec368373,network=Network(8ffc4041-9c9c-4b0f-9342-dcd76540515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap663fa6c4-7d') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:16:42 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:42 user nova-compute[71972]: DEBUG 
ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap663fa6c4-7d, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:16:42 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:42 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:16:42 user nova-compute[71972]: INFO os_vif [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e9:cb:55,bridge_name='br-int',has_traffic_filtering=True,id=663fa6c4-7d6d-4083-8ceb-7d08ec368373,network=Network(8ffc4041-9c9c-4b0f-9342-dcd76540515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap663fa6c4-7d') Apr 17 22:16:42 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Deleting instance files /opt/stack/data/nova/instances/ae185464-abd9-412d-bde1-d667c074abf8_del Apr 17 22:16:42 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Deletion of /opt/stack/data/nova/instances/ae185464-abd9-412d-bde1-d667c074abf8_del complete Apr 17 22:16:42 user nova-compute[71972]: INFO nova.compute.manager [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Took 0.90 seconds to destroy the instance on the hypervisor. Apr 17 22:16:42 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:16:42 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:16:42 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: ae185464-abd9-412d-bde1-d667c074abf8] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:16:42 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:16:42 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:16:42 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:16:42 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:42 user nova-compute[71972]: DEBUG nova.compute.manager [req-8d0f10f0-a720-4dee-a5fe-767ea81ab31f req-f2031a00-5946-459c-9723-54affa5406d1 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Received event network-vif-deleted-663fa6c4-7d6d-4083-8ceb-7d08ec368373 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:16:42 user nova-compute[71972]: INFO nova.compute.manager [req-8d0f10f0-a720-4dee-a5fe-767ea81ab31f req-f2031a00-5946-459c-9723-54affa5406d1 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Neutron deleted interface 663fa6c4-7d6d-4083-8ceb-7d08ec368373; detaching it from the instance and deleting it from the info cache Apr 17 22:16:42 user nova-compute[71972]: DEBUG nova.network.neutron [req-8d0f10f0-a720-4dee-a5fe-767ea81ab31f req-f2031a00-5946-459c-9723-54affa5406d1 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:16:42 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:16:43 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Took 0.76 seconds to deallocate network for instance. Apr 17 22:16:43 user nova-compute[71972]: DEBUG nova.compute.manager [req-8d0f10f0-a720-4dee-a5fe-767ea81ab31f req-f2031a00-5946-459c-9723-54affa5406d1 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Detach interface failed, port_id=663fa6c4-7d6d-4083-8ceb-7d08ec368373, reason: Instance ae185464-abd9-412d-bde1-d667c074abf8 could not be found. 
{{(pid=71972) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 17 22:16:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:16:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:16:43 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:16:43 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:16:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.178s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:16:43 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Deleted allocations for instance ae185464-abd9-412d-bde1-d667c074abf8 Apr 17 22:16:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-5d4d94d9-98cf-4448-835d-d3b9337d9e8d tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "ae185464-abd9-412d-bde1-d667c074abf8" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.034s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:16:43 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 
22:16:43 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:16:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "refresh_cache-99cb7131-abb8-41d6-bddd-c3bc943b7678" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:16:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquired lock "refresh_cache-99cb7131-abb8-41d6-bddd-c3bc943b7678" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:16:43 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Forcefully refreshing network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 22:16:43 user nova-compute[71972]: DEBUG nova.compute.manager [req-f63b202f-437d-41b8-bcff-202fa508cc32 req-0de66744-db80-4013-92f1-0d4290f5a540 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Received event network-vif-plugged-663fa6c4-7d6d-4083-8ceb-7d08ec368373 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:16:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-f63b202f-437d-41b8-bcff-202fa508cc32 req-0de66744-db80-4013-92f1-0d4290f5a540 service nova] Acquiring lock "ae185464-abd9-412d-bde1-d667c074abf8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:16:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-f63b202f-437d-41b8-bcff-202fa508cc32 req-0de66744-db80-4013-92f1-0d4290f5a540 service nova] Lock "ae185464-abd9-412d-bde1-d667c074abf8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:16:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-f63b202f-437d-41b8-bcff-202fa508cc32 req-0de66744-db80-4013-92f1-0d4290f5a540 service nova] Lock "ae185464-abd9-412d-bde1-d667c074abf8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:16:43 user nova-compute[71972]: DEBUG nova.compute.manager [req-f63b202f-437d-41b8-bcff-202fa508cc32 req-0de66744-db80-4013-92f1-0d4290f5a540 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] No waiting events found dispatching network-vif-plugged-663fa6c4-7d6d-4083-8ceb-7d08ec368373 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:16:43 user nova-compute[71972]: WARNING nova.compute.manager [req-f63b202f-437d-41b8-bcff-202fa508cc32 req-0de66744-db80-4013-92f1-0d4290f5a540 service nova] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Received unexpected event network-vif-plugged-663fa6c4-7d6d-4083-8ceb-7d08ec368373 for instance with vm_state deleted and task_state None. 
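The "Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e" lines repeat the inventory the resource tracker reports to Placement. For each resource class, Placement allocates against roughly (total - reserved) * allocation_ratio, which is how 12 host vCPUs with allocation_ratio 4.0 can back up to 48 guest vCPUs while memory and disk stay at a ratio of 1.0. A small plain-Python illustration using the values from the log (not Nova code):

    # Inventory as logged for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e.
    inventory = {
        'VCPU':      {'total': 12,    'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 16023, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 40,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    def schedulable(inv):
        # Capacity Placement will allocate against, per resource class.
        return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
                for rc, v in inv.items()}

    print(schedulable(inventory))
    # {'VCPU': 48.0, 'MEMORY_MB': 15511.0, 'DISK_GB': 40.0}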
Apr 17 22:16:44 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Updating instance_info_cache with network_info: [{"id": "aede8066-45b3-4414-98a0-50dda5a4ee66", "address": "fa:16:3e:be:3c:02", "network": {"id": "966f14e1-bfad-4b86-921c-0f8f5ad29a5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1196755439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.137", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b99089f4e3074ee7a5c1ada03ceb8984", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaede8066-45", "ovs_interfaceid": "aede8066-45b3-4414-98a0-50dda5a4ee66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:16:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Releasing lock "refresh_cache-99cb7131-abb8-41d6-bddd-c3bc943b7678" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:16:44 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Updated the network info_cache for instance {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 22:16:44 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:16:45 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:16:45 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:16:45 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:16:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:49 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:57 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:16:57 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: ae185464-abd9-412d-bde1-d667c074abf8] VM Stopped (Lifecycle Event) Apr 17 22:16:57 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:57 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:57 user nova-compute[71972]: DEBUG nova.compute.manager [None req-5a00b947-4eef-490a-a85d-556ab54719c0 None None] [instance: ae185464-abd9-412d-bde1-d667c074abf8] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:16:57 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:16:59 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Acquiring lock "4c1a87de-805e-4878-bd87-56b203e510de" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Lock "4c1a87de-805e-4878-bd87-56b203e510de" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG nova.compute.manager [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Starting instance... 
{{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Require both a host and instance NUMA topology to fit instance on host. {{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:17:02 user nova-compute[71972]: INFO nova.compute.claims [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Claim successful on node user Apr 17 22:17:02 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.230s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG nova.compute.manager [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Start building 
networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG nova.compute.manager [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Allocating IP information in the background. {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG nova.network.neutron [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:17:02 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 22:17:02 user nova-compute[71972]: DEBUG nova.compute.manager [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Start building block device mappings for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG nova.policy [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'daf86cb2718140b09cbc79de9e54efb6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b4f971b7fb95478e83cf0bc9f0eb92d5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG nova.compute.manager [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Start spawning the instance on the hypervisor. 
{{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:17:02 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Creating image(s) Apr 17 22:17:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Acquiring lock "/opt/stack/data/nova/instances/4c1a87de-805e-4878-bd87-56b203e510de/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Lock "/opt/stack/data/nova/instances/4c1a87de-805e-4878-bd87-56b203e510de/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Lock "/opt/stack/data/nova/instances/4c1a87de-805e-4878-bd87-56b203e510de/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 
in 0.135s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:02 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:03 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.132s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:03 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/4c1a87de-805e-4878-bd87-56b203e510de/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:03 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/4c1a87de-805e-4878-bd87-56b203e510de/disk 1073741824" returned: 0 in 0.045s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:03 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 
tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.184s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:03 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:03 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.141s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:03 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Checking if we can resize image /opt/stack/data/nova/instances/4c1a87de-805e-4878-bd87-56b203e510de/disk. size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:17:03 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4c1a87de-805e-4878-bd87-56b203e510de/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:03 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4c1a87de-805e-4878-bd87-56b203e510de/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:03 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Cannot resize image /opt/stack/data/nova/instances/4c1a87de-805e-4878-bd87-56b203e510de/disk to a smaller size. 
{{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:17:03 user nova-compute[71972]: DEBUG nova.objects.instance [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Lazy-loading 'migration_context' on Instance uuid 4c1a87de-805e-4878-bd87-56b203e510de {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:17:03 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:17:03 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Ensure instance console log exists: /opt/stack/data/nova/instances/4c1a87de-805e-4878-bd87-56b203e510de/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:17:03 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:03 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:03 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:03 user nova-compute[71972]: DEBUG nova.network.neutron [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Successfully created port: 7d86402d-0aa3-49a9-9e27-8b623fd9b33a {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.network.neutron [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Successfully updated port: 7d86402d-0aa3-49a9-9e27-8b623fd9b33a {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None 
req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Acquiring lock "refresh_cache-4c1a87de-805e-4878-bd87-56b203e510de" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Acquired lock "refresh_cache-4c1a87de-805e-4878-bd87-56b203e510de" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.network.neutron [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.compute.manager [req-f9ab5de8-dbbe-4db4-a9b7-5de76b4bf797 req-0a4c3ad4-1bb3-4dc4-8282-a1b48da96100 service nova] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Received event network-changed-7d86402d-0aa3-49a9-9e27-8b623fd9b33a {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.compute.manager [req-f9ab5de8-dbbe-4db4-a9b7-5de76b4bf797 req-0a4c3ad4-1bb3-4dc4-8282-a1b48da96100 service nova] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Refreshing instance network info cache due to event network-changed-7d86402d-0aa3-49a9-9e27-8b623fd9b33a. {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-f9ab5de8-dbbe-4db4-a9b7-5de76b4bf797 req-0a4c3ad4-1bb3-4dc4-8282-a1b48da96100 service nova] Acquiring lock "refresh_cache-4c1a87de-805e-4878-bd87-56b203e510de" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.network.neutron [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Instance cache missing network info. 
{{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.network.neutron [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Updating instance_info_cache with network_info: [{"id": "7d86402d-0aa3-49a9-9e27-8b623fd9b33a", "address": "fa:16:3e:74:a3:6e", "network": {"id": "5a0defe9-b217-4eee-8448-70ad161a6de1", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-10075044-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4f971b7fb95478e83cf0bc9f0eb92d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d86402d-0a", "ovs_interfaceid": "7d86402d-0aa3-49a9-9e27-8b623fd9b33a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Releasing lock "refresh_cache-4c1a87de-805e-4878-bd87-56b203e510de" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.compute.manager [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Instance network_info: |[{"id": "7d86402d-0aa3-49a9-9e27-8b623fd9b33a", "address": "fa:16:3e:74:a3:6e", "network": {"id": "5a0defe9-b217-4eee-8448-70ad161a6de1", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-10075044-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4f971b7fb95478e83cf0bc9f0eb92d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d86402d-0a", "ovs_interfaceid": "7d86402d-0aa3-49a9-9e27-8b623fd9b33a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-f9ab5de8-dbbe-4db4-a9b7-5de76b4bf797 req-0a4c3ad4-1bb3-4dc4-8282-a1b48da96100 service nova] Acquired lock "refresh_cache-4c1a87de-805e-4878-bd87-56b203e510de" {{(pid=71972) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.network.neutron [req-f9ab5de8-dbbe-4db4-a9b7-5de76b4bf797 req-0a4c3ad4-1bb3-4dc4-8282-a1b48da96100 service nova] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Refreshing network info cache for port 7d86402d-0aa3-49a9-9e27-8b623fd9b33a {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Start _get_guest_xml network_info=[{"id": "7d86402d-0aa3-49a9-9e27-8b623fd9b33a", "address": "fa:16:3e:74:a3:6e", "network": {"id": "5a0defe9-b217-4eee-8448-70ad161a6de1", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-10075044-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4f971b7fb95478e83cf0bc9f0eb92d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d86402d-0a", "ovs_interfaceid": "7d86402d-0aa3-49a9-9e27-8b623fd9b33a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:17:04 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:17:04 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:17:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-308754079',display_name='tempest-SnapshotDataIntegrityTests-server-308754079',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-308754079',id=20,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFOG3Ac3W7UP7GHDYmWNPhQ0orTfJAryPIfYEeiXGqep9uRC0B5AhMF07UhTVg0hHnzoWVpOM3/yTD0/ALjSUcHNKu/VnX9AxdKs6qx6/qDtrLYFsp1gM6fgmpRc33N0FQ==',key_name='tempest-SnapshotDataIntegrityTests-1740085900',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b4f971b7fb95478e83cf0bc9f0eb92d5',ramdisk_id='',reservation_id='r-k9pofijj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-SnapshotDataIntegrityTests-1966210151',owner_user_name='tempest-SnapshotDataIntegrityTests-1966210151-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:17:03Z,user_data=None,user_id='daf86cb2718140b09cbc79de9e54efb6',uuid=4c1a87de-805e-4878-bd87-56b203e510de,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7d86402d-0aa3-49a9-9e27-8b623fd9b33a", "address": "fa:16:3e:74:a3:6e", "network": {"id": "5a0defe9-b217-4eee-8448-70ad161a6de1", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-10075044-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4f971b7fb95478e83cf0bc9f0eb92d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d86402d-0a", "ovs_interfaceid": "7d86402d-0aa3-49a9-9e27-8b623fd9b33a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Converting VIF {"id": "7d86402d-0aa3-49a9-9e27-8b623fd9b33a", "address": "fa:16:3e:74:a3:6e", "network": {"id": "5a0defe9-b217-4eee-8448-70ad161a6de1", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-10075044-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4f971b7fb95478e83cf0bc9f0eb92d5", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d86402d-0a", "ovs_interfaceid": "7d86402d-0aa3-49a9-9e27-8b623fd9b33a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:a3:6e,bridge_name='br-int',has_traffic_filtering=True,id=7d86402d-0aa3-49a9-9e27-8b623fd9b33a,network=Network(5a0defe9-b217-4eee-8448-70ad161a6de1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d86402d-0a') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.objects.instance [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Lazy-loading 'pci_devices' on Instance uuid 4c1a87de-805e-4878-bd87-56b203e510de {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] End _get_guest_xml xml= Apr 17 22:17:04 user nova-compute[71972]: 4c1a87de-805e-4878-bd87-56b203e510de Apr 17 22:17:04 user nova-compute[71972]: instance-00000014 Apr 17 22:17:04 user nova-compute[71972]: 131072 Apr 17 22:17:04 user nova-compute[71972]: 1 Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: tempest-SnapshotDataIntegrityTests-server-308754079 Apr 17 22:17:04 user nova-compute[71972]: 2023-04-17 22:17:04 Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: 128 Apr 17 22:17:04 user nova-compute[71972]: 1 Apr 17 22:17:04 user nova-compute[71972]: 0 Apr 17 22:17:04 user nova-compute[71972]: 0 Apr 17 22:17:04 user nova-compute[71972]: 1 Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: tempest-SnapshotDataIntegrityTests-1966210151-project-member Apr 17 22:17:04 user nova-compute[71972]: tempest-SnapshotDataIntegrityTests-1966210151 Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: OpenStack Foundation Apr 17 22:17:04 user nova-compute[71972]: OpenStack Nova Apr 17 22:17:04 user nova-compute[71972]: 0.0.0 Apr 17 22:17:04 user nova-compute[71972]: 4c1a87de-805e-4878-bd87-56b203e510de Apr 
17 22:17:04 user nova-compute[71972]: 4c1a87de-805e-4878-bd87-56b203e510de Apr 17 22:17:04 user nova-compute[71972]: Virtual Machine Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: hvm Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Nehalem Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: /dev/urandom Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: Apr 17 22:17:04 user nova-compute[71972]: {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:17:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-308754079',display_name='tempest-SnapshotDataIntegrityTests-server-308754079',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-308754079',id=20,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFOG3Ac3W7UP7GHDYmWNPhQ0orTfJAryPIfYEeiXGqep9uRC0B5AhMF07UhTVg0hHnzoWVpOM3/yTD0/ALjSUcHNKu/VnX9AxdKs6qx6/qDtrLYFsp1gM6fgmpRc33N0FQ==',key_name='tempest-SnapshotDataIntegrityTests-1740085900',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b4f971b7fb95478e83cf0bc9f0eb92d5',ramdisk_id='',reservation_id='r-k9pofijj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-SnapshotDataIntegrityTests-1966210151',owner_user_name='tempest-SnapshotDataIntegrityTests-1966210151-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:17:03Z,user_data=None,user_id='daf86cb2718140b09cbc79de9e54efb6',uuid=4c1a87de-805e-4878-bd87-56b203e510de,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7d86402d-0aa3-49a9-9e27-8b623fd9b33a", "address": "fa:16:3e:74:a3:6e", "network": {"id": "5a0defe9-b217-4eee-8448-70ad161a6de1", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-10075044-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4f971b7fb95478e83cf0bc9f0eb92d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d86402d-0a", "ovs_interfaceid": "7d86402d-0aa3-49a9-9e27-8b623fd9b33a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Converting VIF {"id": "7d86402d-0aa3-49a9-9e27-8b623fd9b33a", "address": "fa:16:3e:74:a3:6e", "network": {"id": "5a0defe9-b217-4eee-8448-70ad161a6de1", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-10075044-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4f971b7fb95478e83cf0bc9f0eb92d5", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d86402d-0a", "ovs_interfaceid": "7d86402d-0aa3-49a9-9e27-8b623fd9b33a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:a3:6e,bridge_name='br-int',has_traffic_filtering=True,id=7d86402d-0aa3-49a9-9e27-8b623fd9b33a,network=Network(5a0defe9-b217-4eee-8448-70ad161a6de1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d86402d-0a') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG os_vif [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:a3:6e,bridge_name='br-int',has_traffic_filtering=True,id=7d86402d-0aa3-49a9-9e27-8b623fd9b33a,network=Network(5a0defe9-b217-4eee-8448-70ad161a6de1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d86402d-0a') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d86402d-0a, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7d86402d-0a, col_values=(('external_ids', {'iface-id': '7d86402d-0aa3-49a9-9e27-8b623fd9b33a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:74:a3:6e', 'vm-uuid': '4c1a87de-805e-4878-bd87-56b203e510de'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:04 user nova-compute[71972]: INFO os_vif [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:a3:6e,bridge_name='br-int',has_traffic_filtering=True,id=7d86402d-0aa3-49a9-9e27-8b623fd9b33a,network=Network(5a0defe9-b217-4eee-8448-70ad161a6de1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d86402d-0a') Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] No BDM found with device name vda, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:17:04 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] No VIF found with MAC fa:16:3e:74:a3:6e, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:17:05 user nova-compute[71972]: DEBUG nova.network.neutron [req-f9ab5de8-dbbe-4db4-a9b7-5de76b4bf797 req-0a4c3ad4-1bb3-4dc4-8282-a1b48da96100 service nova] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Updated VIF entry in instance network info cache for port 7d86402d-0aa3-49a9-9e27-8b623fd9b33a. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:17:05 user nova-compute[71972]: DEBUG nova.network.neutron [req-f9ab5de8-dbbe-4db4-a9b7-5de76b4bf797 req-0a4c3ad4-1bb3-4dc4-8282-a1b48da96100 service nova] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Updating instance_info_cache with network_info: [{"id": "7d86402d-0aa3-49a9-9e27-8b623fd9b33a", "address": "fa:16:3e:74:a3:6e", "network": {"id": "5a0defe9-b217-4eee-8448-70ad161a6de1", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-10075044-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4f971b7fb95478e83cf0bc9f0eb92d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d86402d-0a", "ovs_interfaceid": "7d86402d-0aa3-49a9-9e27-8b623fd9b33a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:17:05 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-f9ab5de8-dbbe-4db4-a9b7-5de76b4bf797 req-0a4c3ad4-1bb3-4dc4-8282-a1b48da96100 service nova] Releasing lock "refresh_cache-4c1a87de-805e-4878-bd87-56b203e510de" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:17:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:06 user nova-compute[71972]: DEBUG nova.compute.manager [req-6c87be73-20a6-4f98-bb30-7bde1f6c1a74 req-549805ce-6071-4c8a-b556-391327b1027c service nova] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Received event network-vif-plugged-7d86402d-0aa3-49a9-9e27-8b623fd9b33a {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:17:06 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-6c87be73-20a6-4f98-bb30-7bde1f6c1a74 req-549805ce-6071-4c8a-b556-391327b1027c service nova] Acquiring lock "4c1a87de-805e-4878-bd87-56b203e510de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:06 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-6c87be73-20a6-4f98-bb30-7bde1f6c1a74 req-549805ce-6071-4c8a-b556-391327b1027c service nova] Lock "4c1a87de-805e-4878-bd87-56b203e510de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:06 
user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-6c87be73-20a6-4f98-bb30-7bde1f6c1a74 req-549805ce-6071-4c8a-b556-391327b1027c service nova] Lock "4c1a87de-805e-4878-bd87-56b203e510de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:06 user nova-compute[71972]: DEBUG nova.compute.manager [req-6c87be73-20a6-4f98-bb30-7bde1f6c1a74 req-549805ce-6071-4c8a-b556-391327b1027c service nova] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] No waiting events found dispatching network-vif-plugged-7d86402d-0aa3-49a9-9e27-8b623fd9b33a {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:17:06 user nova-compute[71972]: WARNING nova.compute.manager [req-6c87be73-20a6-4f98-bb30-7bde1f6c1a74 req-549805ce-6071-4c8a-b556-391327b1027c service nova] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Received unexpected event network-vif-plugged-7d86402d-0aa3-49a9-9e27-8b623fd9b33a for instance with vm_state building and task_state spawning. Apr 17 22:17:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:07 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:08 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:17:08 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] VM Resumed (Lifecycle Event) Apr 17 22:17:08 user nova-compute[71972]: DEBUG nova.compute.manager [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Instance event wait completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:17:08 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:17:08 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Instance spawned successfully. 
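Each image inspection and overlay creation in the build above is an external qemu-img call issued through oslo.concurrency, wrapped in the oslo_concurrency.prlimit helper with a 1 GiB address-space and 30 s CPU cap (--as=1073741824 --cpu=30); the log therefore shows each call first as "Running cmd (subprocess)" and then as "CMD ... returned: 0 in N s". A minimal sketch of that pattern, assuming only the public processutils API; the path and the image_info helper below are illustrative, not taken from nova:

# Sketch: run the same capped "qemu-img info --output=json" call the log shows.
import json

from oslo_concurrency import processutils

# Same limits as the logged prlimit wrapper: 1 GiB address space, 30 s of CPU.
QEMU_IMG_LIMITS = processutils.ProcessLimits(cpu_time=30,
                                             address_space=1024 ** 3)

def image_info(path):
    """Return the parsed qemu-img info JSON for *path* (illustrative helper)."""
    out, _err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info', path, '--force-share', '--output=json',
        prlimit=QEMU_IMG_LIMITS)
    return json.loads(out)

if __name__ == '__main__':
    info = image_info('/opt/stack/data/nova/instances/_base/example')
    print(info.get('format'), info.get('virtual-size'))
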
Apr 17 22:17:08 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:17:08 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:17:08 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:17:08 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:17:08 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:17:08 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:17:08 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:17:08 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:17:08 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 
tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Found default for hw_vif_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:17:08 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:17:08 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:17:08 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] VM Started (Lifecycle Event) Apr 17 22:17:08 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:17:08 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:17:08 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:17:08 user nova-compute[71972]: INFO nova.compute.manager [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Took 5.55 seconds to spawn the instance on the hypervisor. Apr 17 22:17:08 user nova-compute[71972]: DEBUG nova.compute.manager [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:17:08 user nova-compute[71972]: INFO nova.compute.manager [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Took 6.08 seconds to build instance. 
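The "Acquiring lock ... by ..." / "Lock ... acquired ... waited" / "Lock ... released ... held" triplets around the build (the instance UUID lock, the "<uuid>-events" lock, "vgpu_resources", the disk.info lock) are all emitted by oslo.concurrency's lockutils wrappers. A rough sketch of the two usage styles behind those lines, assuming only the public lockutils API; the lock names and functions below are illustrative:

# Sketch of the lockutils patterns that produce the acquire/release DEBUG lines.
from oslo_concurrency import lockutils

@lockutils.synchronized('example-lock-name')
def locked_build():
    # Decorator form: one caller at a time; emits the
    # "acquired by ... :: waited" / "released by ... :: held" pair with timings.
    pass

def guarded_update(shared):
    # Context-manager form: emits the plain "Acquiring/Acquired/Releasing lock"
    # lines, like the refresh_cache-<uuid> lock above.
    with lockutils.lock('refresh_cache-example'):
        shared['updated'] = True
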
Apr 17 22:17:08 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-3dd61f76-9b76-4bb4-b473-90fc6ab04db8 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Lock "4c1a87de-805e-4878-bd87-56b203e510de" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.181s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:08 user nova-compute[71972]: DEBUG nova.compute.manager [req-14b06311-43b3-4e3c-a6cb-c7840485ebb6 req-d2508aa4-b218-44f1-9fb5-52bf94258dcd service nova] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Received event network-vif-plugged-7d86402d-0aa3-49a9-9e27-8b623fd9b33a {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:17:08 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-14b06311-43b3-4e3c-a6cb-c7840485ebb6 req-d2508aa4-b218-44f1-9fb5-52bf94258dcd service nova] Acquiring lock "4c1a87de-805e-4878-bd87-56b203e510de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:08 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-14b06311-43b3-4e3c-a6cb-c7840485ebb6 req-d2508aa4-b218-44f1-9fb5-52bf94258dcd service nova] Lock "4c1a87de-805e-4878-bd87-56b203e510de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:08 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-14b06311-43b3-4e3c-a6cb-c7840485ebb6 req-d2508aa4-b218-44f1-9fb5-52bf94258dcd service nova] Lock "4c1a87de-805e-4878-bd87-56b203e510de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:08 user nova-compute[71972]: DEBUG nova.compute.manager [req-14b06311-43b3-4e3c-a6cb-c7840485ebb6 req-d2508aa4-b218-44f1-9fb5-52bf94258dcd service nova] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] No waiting events found dispatching network-vif-plugged-7d86402d-0aa3-49a9-9e27-8b623fd9b33a {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:17:08 user nova-compute[71972]: WARNING nova.compute.manager [req-14b06311-43b3-4e3c-a6cb-c7840485ebb6 req-d2508aa4-b218-44f1-9fb5-52bf94258dcd service nova] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Received unexpected event network-vif-plugged-7d86402d-0aa3-49a9-9e27-8b623fd9b33a for instance with vm_state active and task_state None. 
Apr 17 22:17:09 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:12 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:19 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:24 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:27 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:28 user nova-compute[71972]: DEBUG nova.compute.manager [None req-afae4eee-0958-4a8e-a2e2-a95a5bff5270 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:17:28 user nova-compute[71972]: INFO nova.compute.manager [None req-afae4eee-0958-4a8e-a2e2-a95a5bff5270 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] instance snapshotting Apr 17 22:17:28 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-afae4eee-0958-4a8e-a2e2-a95a5bff5270 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Beginning live snapshot process Apr 17 22:17:28 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-afae4eee-0958-4a8e-a2e2-a95a5bff5270 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json -f qcow2 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:28 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-afae4eee-0958-4a8e-a2e2-a95a5bff5270 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] 
CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json -f qcow2" returned: 0 in 0.142s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:28 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-afae4eee-0958-4a8e-a2e2-a95a5bff5270 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json -f qcow2 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:29 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-afae4eee-0958-4a8e-a2e2-a95a5bff5270 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json -f qcow2" returned: 0 in 0.146s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:29 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-afae4eee-0958-4a8e-a2e2-a95a5bff5270 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:29 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-afae4eee-0958-4a8e-a2e2-a95a5bff5270 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.134s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:29 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-afae4eee-0958-4a8e-a2e2-a95a5bff5270 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpp5481dxr/94705e7deb8a4d9b86316fe956c261ff.delta 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:29 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-afae4eee-0958-4a8e-a2e2-a95a5bff5270 tempest-ServerBootFromVolumeStableRescueTest-480550513 
tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpp5481dxr/94705e7deb8a4d9b86316fe956c261ff.delta 1073741824" returned: 0 in 0.047s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:29 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-afae4eee-0958-4a8e-a2e2-a95a5bff5270 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Quiescing instance not available: QEMU guest agent is not enabled. Apr 17 22:17:29 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:29 user nova-compute[71972]: DEBUG nova.virt.libvirt.guest [None req-afae4eee-0958-4a8e-a2e2-a95a5bff5270 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=71972) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 17 22:17:30 user nova-compute[71972]: DEBUG nova.virt.libvirt.guest [None req-afae4eee-0958-4a8e-a2e2-a95a5bff5270 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=71972) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 17 22:17:30 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-afae4eee-0958-4a8e-a2e2-a95a5bff5270 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Skipping quiescing instance: QEMU guest agent is not enabled. 
Apr 17 22:17:30 user nova-compute[71972]: DEBUG nova.privsep.utils [None req-afae4eee-0958-4a8e-a2e2-a95a5bff5270 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71972) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 17 22:17:30 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-afae4eee-0958-4a8e-a2e2-a95a5bff5270 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpp5481dxr/94705e7deb8a4d9b86316fe956c261ff.delta /opt/stack/data/nova/instances/snapshots/tmpp5481dxr/94705e7deb8a4d9b86316fe956c261ff {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Acquiring lock "7bb0580b-727f-4168-9d56-56dcb4fa404e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "7bb0580b-727f-4168-9d56-56dcb4fa404e" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Acquiring lock "7bb0580b-727f-4168-9d56-56dcb4fa404e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "7bb0580b-727f-4168-9d56-56dcb4fa404e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "7bb0580b-727f-4168-9d56-56dcb4fa404e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:30 user nova-compute[71972]: INFO nova.compute.manager [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 
7bb0580b-727f-4168-9d56-56dcb4fa404e] Terminating instance Apr 17 22:17:30 user nova-compute[71972]: DEBUG nova.compute.manager [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Start destroying the instance on the hypervisor. {{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:17:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:31 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-afae4eee-0958-4a8e-a2e2-a95a5bff5270 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpp5481dxr/94705e7deb8a4d9b86316fe956c261ff.delta /opt/stack/data/nova/instances/snapshots/tmpp5481dxr/94705e7deb8a4d9b86316fe956c261ff" returned: 0 in 0.722s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:31 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-afae4eee-0958-4a8e-a2e2-a95a5bff5270 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Snapshot extracted, beginning image upload Apr 17 22:17:31 user nova-compute[71972]: DEBUG nova.compute.manager [req-7b896f2a-6496-4013-99ba-bcd7c456b7bd req-4ef9b720-4312-4599-a2ae-20d9ed5ba8c9 service nova] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Received event network-vif-unplugged-06548a29-a501-4b57-97f1-8afe930c8463 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:17:31 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-7b896f2a-6496-4013-99ba-bcd7c456b7bd req-4ef9b720-4312-4599-a2ae-20d9ed5ba8c9 service nova] Acquiring lock "7bb0580b-727f-4168-9d56-56dcb4fa404e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:31 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-7b896f2a-6496-4013-99ba-bcd7c456b7bd req-4ef9b720-4312-4599-a2ae-20d9ed5ba8c9 service nova] Lock "7bb0580b-727f-4168-9d56-56dcb4fa404e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:31 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-7b896f2a-6496-4013-99ba-bcd7c456b7bd req-4ef9b720-4312-4599-a2ae-20d9ed5ba8c9 service nova] Lock 
"7bb0580b-727f-4168-9d56-56dcb4fa404e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:31 user nova-compute[71972]: DEBUG nova.compute.manager [req-7b896f2a-6496-4013-99ba-bcd7c456b7bd req-4ef9b720-4312-4599-a2ae-20d9ed5ba8c9 service nova] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] No waiting events found dispatching network-vif-unplugged-06548a29-a501-4b57-97f1-8afe930c8463 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:17:31 user nova-compute[71972]: DEBUG nova.compute.manager [req-7b896f2a-6496-4013-99ba-bcd7c456b7bd req-4ef9b720-4312-4599-a2ae-20d9ed5ba8c9 service nova] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Received event network-vif-unplugged-06548a29-a501-4b57-97f1-8afe930c8463 for instance with task_state deleting. {{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:17:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:31 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Instance destroyed successfully. Apr 17 22:17:31 user nova-compute[71972]: DEBUG nova.objects.instance [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lazy-loading 'resources' on Instance uuid 7bb0580b-727f-4168-9d56-56dcb4fa404e {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:17:31 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:08:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServersNegativeTestJSON-server-1710989635',display_name='tempest-ServersNegativeTestJSON-server-1710989635',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serversnegativetestjson-server-1710989635',id=1,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-17T22:09:14Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='f21699c3400842d3a28e71b288a4aaff',ramdisk_id='',reservation_id='r-jifig91e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServersNegativeTestJSON-1844623378',owner_user_name='tempest-ServersNegativeTestJSON-1844623378-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:09:15Z,user_data=None,user_id='51c0b269c97241d9ad122b23af3ca7ea',uuid=7bb0580b-727f-4168-9d56-56dcb4fa404e,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06548a29-a501-4b57-97f1-8afe930c8463", "address": "fa:16:3e:d4:82:7c", "network": {"id": "3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap06548a29-a5", "ovs_interfaceid": "06548a29-a501-4b57-97f1-8afe930c8463", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:17:31 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Converting VIF {"id": 
"06548a29-a501-4b57-97f1-8afe930c8463", "address": "fa:16:3e:d4:82:7c", "network": {"id": "3bbe159c-29cd-4095-9556-8169500b1716", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-840222337-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "f21699c3400842d3a28e71b288a4aaff", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap06548a29-a5", "ovs_interfaceid": "06548a29-a501-4b57-97f1-8afe930c8463", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:17:31 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d4:82:7c,bridge_name='br-int',has_traffic_filtering=True,id=06548a29-a501-4b57-97f1-8afe930c8463,network=Network(3bbe159c-29cd-4095-9556-8169500b1716),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06548a29-a5') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:17:31 user nova-compute[71972]: DEBUG os_vif [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d4:82:7c,bridge_name='br-int',has_traffic_filtering=True,id=06548a29-a501-4b57-97f1-8afe930c8463,network=Network(3bbe159c-29cd-4095-9556-8169500b1716),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06548a29-a5') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:17:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06548a29-a5, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:17:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:17:31 user nova-compute[71972]: INFO os_vif [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Successfully unplugged vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:d4:82:7c,bridge_name='br-int',has_traffic_filtering=True,id=06548a29-a501-4b57-97f1-8afe930c8463,network=Network(3bbe159c-29cd-4095-9556-8169500b1716),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06548a29-a5') Apr 17 22:17:31 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Deleting instance files /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e_del Apr 17 22:17:31 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Deletion of /opt/stack/data/nova/instances/7bb0580b-727f-4168-9d56-56dcb4fa404e_del complete Apr 17 22:17:32 user nova-compute[71972]: INFO nova.compute.manager [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Took 1.16 seconds to destroy the instance on the hypervisor. Apr 17 22:17:32 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:17:32 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:17:32 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:17:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:32 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:17:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:32 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Took 0.52 seconds to deallocate network for instance. 
Apr 17 22:17:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:32 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:17:32 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:17:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.186s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:32 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Deleted allocations for instance 7bb0580b-727f-4168-9d56-56dcb4fa404e Apr 17 22:17:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-dc90fe93-6ffe-471b-b317-4188d95c4735 tempest-ServersNegativeTestJSON-1844623378 tempest-ServersNegativeTestJSON-1844623378-project-member] Lock "7bb0580b-727f-4168-9d56-56dcb4fa404e" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.060s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:33 user nova-compute[71972]: DEBUG nova.compute.manager [req-033c020b-e315-43a7-91fc-789977a949b5 
req-f2bffb9f-aa32-4c3f-8c1c-9a6a050427f2 service nova] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Received event network-vif-plugged-06548a29-a501-4b57-97f1-8afe930c8463 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:17:33 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-033c020b-e315-43a7-91fc-789977a949b5 req-f2bffb9f-aa32-4c3f-8c1c-9a6a050427f2 service nova] Acquiring lock "7bb0580b-727f-4168-9d56-56dcb4fa404e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:33 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-033c020b-e315-43a7-91fc-789977a949b5 req-f2bffb9f-aa32-4c3f-8c1c-9a6a050427f2 service nova] Lock "7bb0580b-727f-4168-9d56-56dcb4fa404e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:33 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-033c020b-e315-43a7-91fc-789977a949b5 req-f2bffb9f-aa32-4c3f-8c1c-9a6a050427f2 service nova] Lock "7bb0580b-727f-4168-9d56-56dcb4fa404e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:33 user nova-compute[71972]: DEBUG nova.compute.manager [req-033c020b-e315-43a7-91fc-789977a949b5 req-f2bffb9f-aa32-4c3f-8c1c-9a6a050427f2 service nova] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] No waiting events found dispatching network-vif-plugged-06548a29-a501-4b57-97f1-8afe930c8463 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:17:33 user nova-compute[71972]: WARNING nova.compute.manager [req-033c020b-e315-43a7-91fc-789977a949b5 req-f2bffb9f-aa32-4c3f-8c1c-9a6a050427f2 service nova] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Received unexpected event network-vif-plugged-06548a29-a501-4b57-97f1-8afe930c8463 for instance with vm_state deleted and task_state None. Apr 17 22:17:33 user nova-compute[71972]: DEBUG nova.compute.manager [req-033c020b-e315-43a7-91fc-789977a949b5 req-f2bffb9f-aa32-4c3f-8c1c-9a6a050427f2 service nova] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Received event network-vif-deleted-06548a29-a501-4b57-97f1-8afe930c8463 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:17:33 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-afae4eee-0958-4a8e-a2e2-a95a5bff5270 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Snapshot image upload complete Apr 17 22:17:33 user nova-compute[71972]: INFO nova.compute.manager [None req-afae4eee-0958-4a8e-a2e2-a95a5bff5270 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Took 5.28 seconds to snapshot the instance on the hypervisor. 
Apr 17 22:17:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Acquiring lock "d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG nova.compute.manager [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Starting instance... {{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:17:35 user nova-compute[71972]: INFO nova.compute.claims [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Claim successful on node user Apr 17 22:17:35 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.259s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG nova.compute.manager [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG nova.compute.manager [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Allocating IP information in the background. {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG nova.network.neutron [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:17:35 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 17 22:17:35 user nova-compute[71972]: DEBUG nova.compute.manager [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Start building block device mappings for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG nova.policy [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '30c12a4244db438ea682e545c378abe1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '19f2a3034fe9458682e1242c91e2ce45', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG nova.compute.manager [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Start spawning the instance on the hypervisor. {{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:17:35 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Creating image(s) Apr 17 22:17:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Acquiring lock "/opt/stack/data/nova/instances/d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "/opt/stack/data/nova/instances/d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "/opt/stack/data/nova/instances/d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0/disk.info" "released" by 
"nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.132s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.133s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Running 
cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0/disk 1073741824" returned: 0 in 0.046s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.186s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:35 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:36 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.145s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:36 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Checking if we can resize image /opt/stack/data/nova/instances/d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0/disk. 
size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:17:36 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:36 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0/disk --force-share --output=json" returned: 0 in 0.152s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:36 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Cannot resize image /opt/stack/data/nova/instances/d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0/disk to a smaller size. {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:17:36 user nova-compute[71972]: DEBUG nova.objects.instance [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lazy-loading 'migration_context' on Instance uuid d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:17:36 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:17:36 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Ensure instance console log exists: /opt/stack/data/nova/instances/d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:17:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:36 user nova-compute[71972]: DEBUG nova.network.neutron [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Successfully created port: 2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560 {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:17:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.network.neutron [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Successfully updated port: 2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560 {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Acquiring lock "refresh_cache-d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Acquired lock "refresh_cache-d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.network.neutron [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.compute.manager [req-207ded6b-c13a-4669-acd6-3721a72545ba req-03caa8c5-f47a-46d9-ba35-9932bb1cdd6b service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Received event network-changed-2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.compute.manager [req-207ded6b-c13a-4669-acd6-3721a72545ba req-03caa8c5-f47a-46d9-ba35-9932bb1cdd6b service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Refreshing instance network info cache due to event network-changed-2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560. 
{{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-207ded6b-c13a-4669-acd6-3721a72545ba req-03caa8c5-f47a-46d9-ba35-9932bb1cdd6b service nova] Acquiring lock "refresh_cache-d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.network.neutron [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Instance cache missing network info. {{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.network.neutron [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Updating instance_info_cache with network_info: [{"id": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "address": "fa:16:3e:f4:40:f4", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fd0b7cd-39", "ovs_interfaceid": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Releasing lock "refresh_cache-d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.compute.manager [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Instance network_info: |[{"id": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "address": "fa:16:3e:f4:40:f4", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, 
"connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fd0b7cd-39", "ovs_interfaceid": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-207ded6b-c13a-4669-acd6-3721a72545ba req-03caa8c5-f47a-46d9-ba35-9932bb1cdd6b service nova] Acquired lock "refresh_cache-d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.network.neutron [req-207ded6b-c13a-4669-acd6-3721a72545ba req-03caa8c5-f47a-46d9-ba35-9932bb1cdd6b service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Refreshing network info cache for port 2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Start _get_guest_xml network_info=[{"id": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "address": "fa:16:3e:f4:40:f4", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fd0b7cd-39", "ovs_interfaceid": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:17:37 user 
nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:17:37 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Chose sockets=0, cores=0, threads=0; 
limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:17:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1878222400',display_name='tempest-AttachVolumeTestJSON-server-1878222400',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1878222400',id=21,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKioJtk4DRUjErpS+dQkd6Zh3VD3Z5KPplVwI2N0nKmntp65lObvZvx08UYCyY2b8WcQy2GnzCG0FbEpNEq4UnZs1TTLDZZcBxjo7SevW/18tN2cTzyfQlmnoJ85kbLQJQ==',key_name='tempest-keypair-4218837',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='19f2a3034fe9458682e1242c91e2ce45',ramdisk_id='',reservation_id='r-5o6t1u5a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-4448958',owner_user_name='tempest-AttachVolumeTestJSON-4448958-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:17:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='30c12a4244db438ea682e545c378abe1',uuid=d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "address": "fa:16:3e:f4:40:f4", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fd0b7cd-39", "ovs_interfaceid": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Converting VIF {"id": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "address": "fa:16:3e:f4:40:f4", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fd0b7cd-39", "ovs_interfaceid": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:40:f4,bridge_name='br-int',has_traffic_filtering=True,id=2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560,network=Network(8ffc4041-9c9c-4b0f-9342-dcd76540515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fd0b7cd-39') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.objects.instance [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lazy-loading 'pci_devices' on Instance uuid d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] End _get_guest_xml xml= Apr 17 22:17:37 user nova-compute[71972]: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0 Apr 17 22:17:37 user nova-compute[71972]: instance-00000015 Apr 17 22:17:37 user nova-compute[71972]: 131072 Apr 17 22:17:37 user nova-compute[71972]: 1 Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: tempest-AttachVolumeTestJSON-server-1878222400 Apr 17 22:17:37 user nova-compute[71972]: 2023-04-17 22:17:37 Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: 128 Apr 17 22:17:37 user nova-compute[71972]: 1 Apr 17 22:17:37 user nova-compute[71972]: 0 Apr 17 22:17:37 user nova-compute[71972]: 0 Apr 17 22:17:37 user nova-compute[71972]: 1 Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: tempest-AttachVolumeTestJSON-4448958-project-member Apr 17 22:17:37 user nova-compute[71972]: tempest-AttachVolumeTestJSON-4448958 Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: OpenStack Foundation Apr 17 22:17:37 user nova-compute[71972]: OpenStack Nova Apr 17 22:17:37 user nova-compute[71972]: 0.0.0 Apr 17 22:17:37 user nova-compute[71972]: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0 Apr 17 22:17:37 user 
nova-compute[71972]: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0 Apr 17 22:17:37 user nova-compute[71972]: Virtual Machine Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: hvm Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Nehalem Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: /dev/urandom Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: Apr 17 22:17:37 user nova-compute[71972]: {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:17:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1878222400',display_name='tempest-AttachVolumeTestJSON-server-1878222400',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1878222400',id=21,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKioJtk4DRUjErpS+dQkd6Zh3VD3Z5KPplVwI2N0nKmntp65lObvZvx08UYCyY2b8WcQy2GnzCG0FbEpNEq4UnZs1TTLDZZcBxjo7SevW/18tN2cTzyfQlmnoJ85kbLQJQ==',key_name='tempest-keypair-4218837',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='19f2a3034fe9458682e1242c91e2ce45',ramdisk_id='',reservation_id='r-5o6t1u5a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-4448958',owner_user_name='tempest-AttachVolumeTestJSON-4448958-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:17:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='30c12a4244db438ea682e545c378abe1',uuid=d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "address": "fa:16:3e:f4:40:f4", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fd0b7cd-39", "ovs_interfaceid": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Converting VIF {"id": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "address": "fa:16:3e:f4:40:f4", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": 
"19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fd0b7cd-39", "ovs_interfaceid": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:40:f4,bridge_name='br-int',has_traffic_filtering=True,id=2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560,network=Network(8ffc4041-9c9c-4b0f-9342-dcd76540515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fd0b7cd-39') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG os_vif [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:40:f4,bridge_name='br-int',has_traffic_filtering=True,id=2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560,network=Network(8ffc4041-9c9c-4b0f-9342-dcd76540515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fd0b7cd-39') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2fd0b7cd-39, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2fd0b7cd-39, col_values=(('external_ids', {'iface-id': '2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:40:f4', 'vm-uuid': 'd4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 
{{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:37 user nova-compute[71972]: INFO os_vif [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:40:f4,bridge_name='br-int',has_traffic_filtering=True,id=2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560,network=Network(8ffc4041-9c9c-4b0f-9342-dcd76540515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fd0b7cd-39') Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] No BDM found with device name vda, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:17:37 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] No VIF found with MAC fa:16:3e:f4:40:f4, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:17:38 user nova-compute[71972]: DEBUG nova.network.neutron [req-207ded6b-c13a-4669-acd6-3721a72545ba req-03caa8c5-f47a-46d9-ba35-9932bb1cdd6b service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Updated VIF entry in instance network info cache for port 2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:17:38 user nova-compute[71972]: DEBUG nova.network.neutron [req-207ded6b-c13a-4669-acd6-3721a72545ba req-03caa8c5-f47a-46d9-ba35-9932bb1cdd6b service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Updating instance_info_cache with network_info: [{"id": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "address": "fa:16:3e:f4:40:f4", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fd0b7cd-39", "ovs_interfaceid": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:17:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-207ded6b-c13a-4669-acd6-3721a72545ba req-03caa8c5-f47a-46d9-ba35-9932bb1cdd6b service nova] Releasing lock "refresh_cache-d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:17:39 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:39 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:39 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:39 user nova-compute[71972]: DEBUG nova.compute.manager [req-c49dfa99-2025-4ee3-afec-63caffe76f90 req-e792bf31-8c58-4183-bc96-27a0cece7773 service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Received event network-vif-plugged-2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:17:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-c49dfa99-2025-4ee3-afec-63caffe76f90 req-e792bf31-8c58-4183-bc96-27a0cece7773 service nova] Acquiring lock "d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-c49dfa99-2025-4ee3-afec-63caffe76f90 req-e792bf31-8c58-4183-bc96-27a0cece7773 service nova] Lock "d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:39 user 
nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-c49dfa99-2025-4ee3-afec-63caffe76f90 req-e792bf31-8c58-4183-bc96-27a0cece7773 service nova] Lock "d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:39 user nova-compute[71972]: DEBUG nova.compute.manager [req-c49dfa99-2025-4ee3-afec-63caffe76f90 req-e792bf31-8c58-4183-bc96-27a0cece7773 service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] No waiting events found dispatching network-vif-plugged-2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:17:39 user nova-compute[71972]: WARNING nova.compute.manager [req-c49dfa99-2025-4ee3-afec-63caffe76f90 req-e792bf31-8c58-4183-bc96-27a0cece7773 service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Received unexpected event network-vif-plugged-2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560 for instance with vm_state building and task_state spawning. Apr 17 22:17:39 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:17:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:39 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:17:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:39 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:39 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on 
fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0/disk --force-share --output=json" returned: 0 in 0.170s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:39 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json" returned: 0 in 0.144s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] 
Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json" returned: 0 in 0.237s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678/disk --force-share --output=json" returned: 0 in 0.155s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4c1a87de-805e-4878-bd87-56b203e510de/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4c1a87de-805e-4878-bd87-56b203e510de/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4c1a87de-805e-4878-bd87-56b203e510de/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4c1a87de-805e-4878-bd87-56b203e510de/disk --force-share --output=json" returned: 0 
in 0.136s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:17:41 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] VM Resumed (Lifecycle Event) Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.compute.manager [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Instance event wait completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.compute.manager [req-d3c61702-8499-4f5d-a17a-46970010184b req-677a8af6-90f2-43d4-9a84-8bd93de9432c service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Received event network-vif-plugged-2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-d3c61702-8499-4f5d-a17a-46970010184b req-677a8af6-90f2-43d4-9a84-8bd93de9432c service nova] Acquiring lock "d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-d3c61702-8499-4f5d-a17a-46970010184b req-677a8af6-90f2-43d4-9a84-8bd93de9432c service nova] Lock "d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-d3c61702-8499-4f5d-a17a-46970010184b req-677a8af6-90f2-43d4-9a84-8bd93de9432c service nova] Lock "d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.compute.manager [req-d3c61702-8499-4f5d-a17a-46970010184b req-677a8af6-90f2-43d4-9a84-8bd93de9432c service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] No waiting events found dispatching network-vif-plugged-2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:17:41 user nova-compute[71972]: WARNING nova.compute.manager [req-d3c61702-8499-4f5d-a17a-46970010184b req-677a8af6-90f2-43d4-9a84-8bd93de9432c service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Received unexpected event 
network-vif-plugged-2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560 for instance with vm_state building and task_state spawning. Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:17:41 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Instance spawned successfully. Apr 17 22:17:41 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:17:41 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=8617MB free_disk=26.56194305419922GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event 
/opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Found default for hw_vif_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:17:41 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:17:41 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] VM Started (Lifecycle Event) Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:17:41 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 99cb7131-abb8-41d6-bddd-c3bc943b7678 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance ac56fdf0-5568-434b-ad61-805634c2beeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 4c1a87de-805e-4878-bd87-56b203e510de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 4 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=1024MB phys_disk=40GB used_disk=4GB total_vcpus=12 used_vcpus=4 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:17:41 user nova-compute[71972]: INFO nova.compute.manager [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Took 6.18 seconds to spawn the instance on the hypervisor. Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.compute.manager [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:17:41 user nova-compute[71972]: INFO nova.compute.manager [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Took 6.79 seconds to build instance. Apr 17 22:17:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-9eccbcce-62f9-4fbf-b411-483e5e07358b tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.888s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:41 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:17:42 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:17:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:17:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None 
req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.290s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:42 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:42 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:43 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:17:43 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:17:43 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:17:43 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:17:43 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:17:43 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:17:43 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Didn't find any instances for network info cache update. 
{{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 17 22:17:43 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:17:45 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:17:45 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:17:45 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:17:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Acquiring lock "99cb7131-abb8-41d6-bddd-c3bc943b7678" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Lock "99cb7131-abb8-41d6-bddd-c3bc943b7678" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Acquiring lock "99cb7131-abb8-41d6-bddd-c3bc943b7678-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Lock "99cb7131-abb8-41d6-bddd-c3bc943b7678-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Lock "99cb7131-abb8-41d6-bddd-c3bc943b7678-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s {{(pid=71972) 
inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:46 user nova-compute[71972]: INFO nova.compute.manager [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Terminating instance Apr 17 22:17:46 user nova-compute[71972]: DEBUG nova.compute.manager [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Start destroying the instance on the hypervisor. {{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:17:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:46 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:17:46 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] VM Stopped (Lifecycle Event) Apr 17 22:17:46 user nova-compute[71972]: DEBUG nova.compute.manager [None req-7191c1ba-8434-46b7-ab74-fe22f7419d70 None None] [instance: 7bb0580b-727f-4168-9d56-56dcb4fa404e] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG nova.compute.manager [req-3244033a-5219-4fd1-95db-9ef4eeed6317 req-a5e4a470-313a-479b-9c0b-d4d773be57e5 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Received event network-vif-unplugged-aede8066-45b3-4414-98a0-50dda5a4ee66 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-3244033a-5219-4fd1-95db-9ef4eeed6317 req-a5e4a470-313a-479b-9c0b-d4d773be57e5 service nova] Acquiring lock "99cb7131-abb8-41d6-bddd-c3bc943b7678-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-3244033a-5219-4fd1-95db-9ef4eeed6317 req-a5e4a470-313a-479b-9c0b-d4d773be57e5 service nova] Lock "99cb7131-abb8-41d6-bddd-c3bc943b7678-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-3244033a-5219-4fd1-95db-9ef4eeed6317 req-a5e4a470-313a-479b-9c0b-d4d773be57e5 service nova] Lock "99cb7131-abb8-41d6-bddd-c3bc943b7678-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:47 user nova-compute[71972]: 
DEBUG nova.compute.manager [req-3244033a-5219-4fd1-95db-9ef4eeed6317 req-a5e4a470-313a-479b-9c0b-d4d773be57e5 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] No waiting events found dispatching network-vif-unplugged-aede8066-45b3-4414-98a0-50dda5a4ee66 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG nova.compute.manager [req-3244033a-5219-4fd1-95db-9ef4eeed6317 req-a5e4a470-313a-479b-9c0b-d4d773be57e5 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Received event network-vif-unplugged-aede8066-45b3-4414-98a0-50dda5a4ee66 for instance with task_state deleting. {{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:47 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Instance destroyed successfully. Apr 17 22:17:47 user nova-compute[71972]: DEBUG nova.objects.instance [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Lazy-loading 'resources' on Instance uuid 99cb7131-abb8-41d6-bddd-c3bc943b7678 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:08:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-2094342662',display_name='tempest-ServerActionsTestJSON-server-2094342662',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serveractionstestjson-server-2094342662',id=3,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMIJM6dCvYLXS8VGI2L2wG2Wl5w2/gW9spGN2iclkhvyAOdgyxFfXQNjna17ZfeIKWKVS3RLrdPtDvd/wHntS9qVvu9iFpd3o+fH4gutMbeRa70JnJhZgkKJB4XiFchpJA==',key_name='tempest-keypair-1204015820',keypairs=,launch_index=0,launched_at=2023-04-17T22:09:14Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='b99089f4e3074ee7a5c1ada03ceb8984',ramdisk_id='',reservation_id='r-1222ez6u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerActionsTestJSON-1322112249',owner_user_name='tempest-ServerActionsTestJSON-1322112249-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:09:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9be63d1d20854fa28375599715a5ba74',uuid=99cb7131-abb8-41d6-bddd-c3bc943b7678,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "aede8066-45b3-4414-98a0-50dda5a4ee66", "address": "fa:16:3e:be:3c:02", "network": {"id": "966f14e1-bfad-4b86-921c-0f8f5ad29a5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1196755439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.137", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b99089f4e3074ee7a5c1ada03ceb8984", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaede8066-45", "ovs_interfaceid": "aede8066-45b3-4414-98a0-50dda5a4ee66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Converting VIF {"id": "aede8066-45b3-4414-98a0-50dda5a4ee66", "address": "fa:16:3e:be:3c:02", "network": {"id": "966f14e1-bfad-4b86-921c-0f8f5ad29a5a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-1196755439-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.137", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b99089f4e3074ee7a5c1ada03ceb8984", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapaede8066-45", "ovs_interfaceid": "aede8066-45b3-4414-98a0-50dda5a4ee66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:be:3c:02,bridge_name='br-int',has_traffic_filtering=True,id=aede8066-45b3-4414-98a0-50dda5a4ee66,network=Network(966f14e1-bfad-4b86-921c-0f8f5ad29a5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaede8066-45') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG os_vif [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:be:3c:02,bridge_name='br-int',has_traffic_filtering=True,id=aede8066-45b3-4414-98a0-50dda5a4ee66,network=Network(966f14e1-bfad-4b86-921c-0f8f5ad29a5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaede8066-45') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaede8066-45, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:47 user nova-compute[71972]: INFO os_vif [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:be:3c:02,bridge_name='br-int',has_traffic_filtering=True,id=aede8066-45b3-4414-98a0-50dda5a4ee66,network=Network(966f14e1-bfad-4b86-921c-0f8f5ad29a5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaede8066-45') Apr 17 22:17:47 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 
tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Deleting instance files /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678_del Apr 17 22:17:47 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Deletion of /opt/stack/data/nova/instances/99cb7131-abb8-41d6-bddd-c3bc943b7678_del complete Apr 17 22:17:47 user nova-compute[71972]: INFO nova.compute.manager [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Took 0.87 seconds to destroy the instance on the hypervisor. Apr 17 22:17:47 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:48 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:17:48 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Took 0.87 seconds to deallocate network for instance. 
Apr 17 22:17:48 user nova-compute[71972]: DEBUG nova.compute.manager [req-81d5ef3a-e7a4-4e73-8c05-dbcf450b0492 req-2f42a8ab-2ae4-4370-9ee6-9b43d17fffd5 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Received event network-vif-deleted-aede8066-45b3-4414-98a0-50dda5a4ee66 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:17:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:48 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:17:48 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:17:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.168s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:48 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Deleted allocations for instance 99cb7131-abb8-41d6-bddd-c3bc943b7678 Apr 17 22:17:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7095a93d-4689-44c8-9e40-2e3505924c8e tempest-ServerActionsTestJSON-1322112249 tempest-ServerActionsTestJSON-1322112249-project-member] Lock "99cb7131-abb8-41d6-bddd-c3bc943b7678" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.073s {{(pid=71972) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:49 user nova-compute[71972]: DEBUG nova.compute.manager [req-343ff9ec-c56f-4def-9eba-65be0de98706 req-50c4910d-9ba3-488b-9599-2a161efad0b2 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Received event network-vif-plugged-aede8066-45b3-4414-98a0-50dda5a4ee66 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:17:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-343ff9ec-c56f-4def-9eba-65be0de98706 req-50c4910d-9ba3-488b-9599-2a161efad0b2 service nova] Acquiring lock "99cb7131-abb8-41d6-bddd-c3bc943b7678-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:17:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-343ff9ec-c56f-4def-9eba-65be0de98706 req-50c4910d-9ba3-488b-9599-2a161efad0b2 service nova] Lock "99cb7131-abb8-41d6-bddd-c3bc943b7678-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:17:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-343ff9ec-c56f-4def-9eba-65be0de98706 req-50c4910d-9ba3-488b-9599-2a161efad0b2 service nova] Lock "99cb7131-abb8-41d6-bddd-c3bc943b7678-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:17:49 user nova-compute[71972]: DEBUG nova.compute.manager [req-343ff9ec-c56f-4def-9eba-65be0de98706 req-50c4910d-9ba3-488b-9599-2a161efad0b2 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] No waiting events found dispatching network-vif-plugged-aede8066-45b3-4414-98a0-50dda5a4ee66 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:17:49 user nova-compute[71972]: WARNING nova.compute.manager [req-343ff9ec-c56f-4def-9eba-65be0de98706 req-50c4910d-9ba3-488b-9599-2a161efad0b2 service nova] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Received unexpected event network-vif-plugged-aede8066-45b3-4414-98a0-50dda5a4ee66 for instance with vm_state deleted and task_state None. 
Apr 17 22:17:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:57 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:17:57 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:02 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:18:02 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] VM Stopped (Lifecycle Event) Apr 17 22:18:02 user nova-compute[71972]: DEBUG nova.compute.manager [None req-405a4a48-9e9c-4b75-80e4-3dcf9488d152 None None] [instance: 99cb7131-abb8-41d6-bddd-c3bc943b7678] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:18:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:07 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:07 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:12 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:17 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:17 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:27 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:30 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "fde4edce-4669-488e-80e5-6ee0029b19d2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:18:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "fde4edce-4669-488e-80e5-6ee0029b19d2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:18:36 user nova-compute[71972]: DEBUG nova.compute.manager [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Starting instance... {{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:18:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:18:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:18:36 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:18:36 user nova-compute[71972]: INFO nova.compute.claims [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Claim successful on node user Apr 17 22:18:36 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:18:36 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:18:36 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.239s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:18:36 user nova-compute[71972]: DEBUG nova.compute.manager [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:18:36 user nova-compute[71972]: DEBUG nova.compute.manager [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Allocating IP information in the background. 
{{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:18:36 user nova-compute[71972]: DEBUG nova.network.neutron [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:18:36 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 22:18:36 user nova-compute[71972]: DEBUG nova.compute.manager [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Start building block device mappings for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:18:36 user nova-compute[71972]: INFO nova.virt.block_device [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Booting with blank volume at /dev/vda Apr 17 22:18:36 user nova-compute[71972]: DEBUG nova.policy [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6df5551ab4974747a0412ce089b770b0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '26967ac9e8cb45b6aea04a699d4a1eca', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:18:36 user nova-compute[71972]: WARNING nova.compute.manager [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Volume id: a13b6586-04db-4345-9436-2d1d8803916f finished being created but its status is error. Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Instance failed block device setup: nova.exception.VolumeNotCreated: Volume a13b6586-04db-4345-9436-2d1d8803916f did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. 
Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Traceback (most recent call last): Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] File "/opt/stack/nova/nova/compute/manager.py", line 2175, in _prep_block_device Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] driver_block_device.attach_block_devices( Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] File "/opt/stack/nova/nova/virt/block_device.py", line 936, in attach_block_devices Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] _log_and_attach(device) Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] File "/opt/stack/nova/nova/virt/block_device.py", line 933, in _log_and_attach Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] bdm.attach(*attach_args, **attach_kwargs) Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] File "/opt/stack/nova/nova/virt/block_device.py", line 848, in attach Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] self.volume_id, self.attachment_id = self._create_volume( Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] File "/opt/stack/nova/nova/virt/block_device.py", line 435, in _create_volume Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] self._call_wait_func(context, wait_func, volume_api, vol['id']) Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] File "/opt/stack/nova/nova/virt/block_device.py", line 785, in _call_wait_func Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] with excutils.save_and_reraise_exception(): Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] self.force_reraise() Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] raise self.value Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] File "/opt/stack/nova/nova/virt/block_device.py", line 783, in _call_wait_func Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] wait_func(context, volume_id) Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] File "/opt/stack/nova/nova/compute/manager.py", line 1792, in 
_await_block_device_map_created Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] raise exception.VolumeNotCreated(volume_id=vol_id, Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] nova.exception.VolumeNotCreated: Volume a13b6586-04db-4345-9436-2d1d8803916f did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. Apr 17 22:18:36 user nova-compute[71972]: ERROR nova.compute.manager [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Apr 17 22:18:37 user nova-compute[71972]: DEBUG nova.network.neutron [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Successfully created port: e1e1efd3-4ae0-4211-ab64-685331ae2ffe {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:18:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.network.neutron [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Successfully updated port: e1e1efd3-4ae0-4211-ab64-685331ae2ffe {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "refresh_cache-fde4edce-4669-488e-80e5-6ee0029b19d2" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquired lock "refresh_cache-fde4edce-4669-488e-80e5-6ee0029b19d2" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.network.neutron [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.compute.manager [req-a4d92633-393e-405f-a5b4-cb5d8cad50de req-4b2e3f13-ffb0-415a-90a0-8698de88b9ad service nova] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Received event network-changed-e1e1efd3-4ae0-4211-ab64-685331ae2ffe {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.compute.manager [req-a4d92633-393e-405f-a5b4-cb5d8cad50de req-4b2e3f13-ffb0-415a-90a0-8698de88b9ad service nova] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Refreshing instance network info cache due to event 
network-changed-e1e1efd3-4ae0-4211-ab64-685331ae2ffe. {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-a4d92633-393e-405f-a5b4-cb5d8cad50de req-4b2e3f13-ffb0-415a-90a0-8698de88b9ad service nova] Acquiring lock "refresh_cache-fde4edce-4669-488e-80e5-6ee0029b19d2" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.network.neutron [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Instance cache missing network info. {{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.network.neutron [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Updating instance_info_cache with network_info: [{"id": "e1e1efd3-4ae0-4211-ab64-685331ae2ffe", "address": "fa:16:3e:68:42:01", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape1e1efd3-4a", "ovs_interfaceid": "e1e1efd3-4ae0-4211-ab64-685331ae2ffe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Releasing lock "refresh_cache-fde4edce-4669-488e-80e5-6ee0029b19d2" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.compute.manager [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Instance network_info: |[{"id": "e1e1efd3-4ae0-4211-ab64-685331ae2ffe", "address": "fa:16:3e:68:42:01", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape1e1efd3-4a", "ovs_interfaceid": "e1e1efd3-4ae0-4211-ab64-685331ae2ffe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-a4d92633-393e-405f-a5b4-cb5d8cad50de req-4b2e3f13-ffb0-415a-90a0-8698de88b9ad service nova] Acquired lock "refresh_cache-fde4edce-4669-488e-80e5-6ee0029b19d2" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.network.neutron [req-a4d92633-393e-405f-a5b4-cb5d8cad50de req-4b2e3f13-ffb0-415a-90a0-8698de88b9ad service nova] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Refreshing network info cache for port e1e1efd3-4ae0-4211-ab64-685331ae2ffe {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.compute.claims [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Aborting claim: {{(pid=71972) abort /opt/stack/nova/nova/compute/claims.py:84}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 
1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.248s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.compute.manager [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Build of instance fde4edce-4669-488e-80e5-6ee0029b19d2 aborted: Volume a13b6586-04db-4345-9436-2d1d8803916f did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. {{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2636}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.compute.utils [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Build of instance fde4edce-4669-488e-80e5-6ee0029b19d2 aborted: Volume a13b6586-04db-4345-9436-2d1d8803916f did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. {{(pid=71972) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} Apr 17 22:18:38 user nova-compute[71972]: ERROR nova.compute.manager [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Build of instance fde4edce-4669-488e-80e5-6ee0029b19d2 aborted: Volume a13b6586-04db-4345-9436-2d1d8803916f did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error.: nova.exception.BuildAbortException: Build of instance fde4edce-4669-488e-80e5-6ee0029b19d2 aborted: Volume a13b6586-04db-4345-9436-2d1d8803916f did not finish being created even after we waited 0 seconds or 1 attempts. And its status is error. 
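
The failure recorded above is Nova's boot-from-volume wait loop giving up: _await_block_device_map_created polls the newly requested Cinder volume and raises VolumeNotCreated as soon as the volume reports 'error', which is why the message reads "waited 0 seconds or 1 attempts". The following is a rough, simplified sketch of that polling pattern, not Nova's actual code; the get_status callable and the exception class here are stand-ins, and in a real deployment the retry budget comes from the nova.conf options block_device_allocate_retries and block_device_allocate_retries_interval.

    # Illustrative sketch only -- mimics the behaviour seen in the trace above:
    # give up immediately if the volume goes to 'error', otherwise keep polling
    # until the retry budget is exhausted.
    import time


    class VolumeNotCreated(Exception):
        """Stand-in for nova.exception.VolumeNotCreated (hypothetical class)."""


    def await_volume_created(get_status, volume_id, retries=60, interval=3):
        """Poll get_status(volume_id) until 'available', 'error', or retries run out.

        get_status is a caller-supplied callable (an assumption of this sketch);
        in the log above the equivalent check is a Cinder API lookup.
        """
        start = time.time()
        for attempt in range(1, max(retries, 1) + 1):
            status = get_status(volume_id)
            if status == 'available':
                return attempt
            if status == 'error':
                # Matches the logged wording: "waited 0 seconds or 1 attempts."
                raise VolumeNotCreated(
                    "Volume %s did not finish being created even after we "
                    "waited %d seconds or %d attempts. And its status is %s."
                    % (volume_id, int(time.time() - start), attempt, status))
            time.sleep(interval)
        raise VolumeNotCreated("Volume %s still not available after %d attempts"
                               % (volume_id, retries))

With a volume that flips straight to 'error', the loop exits on the first iteration, which is what triggers the BuildAbortException and the cleanup sequence that follows in the log.
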
Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.compute.manager [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Unplugging VIFs for instance {{(pid=71972) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:18:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1237221251',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1237221251',id=22,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='26967ac9e8cb45b6aea04a699d4a1eca',ramdisk_id='',reservation_id='r-jvp9m1k9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-480550513',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member'},tags=TagList,task_state='block_device_mapping',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:18:36Z,user_data=None,user_id='6df5551ab4974747a0412ce089b770b0',uuid=fde4edce-4669-488e-80e5-6ee0029b19d2,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e1e1efd3-4ae0-4211-ab64-685331ae2ffe", "address": "fa:16:3e:68:42:01", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": 
"tape1e1efd3-4a", "ovs_interfaceid": "e1e1efd3-4ae0-4211-ab64-685331ae2ffe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Converting VIF {"id": "e1e1efd3-4ae0-4211-ab64-685331ae2ffe", "address": "fa:16:3e:68:42:01", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape1e1efd3-4a", "ovs_interfaceid": "e1e1efd3-4ae0-4211-ab64-685331ae2ffe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:68:42:01,bridge_name='br-int',has_traffic_filtering=True,id=e1e1efd3-4ae0-4211-ab64-685331ae2ffe,network=Network(bf926b5c-baa2-457d-aab9-e2deea0b84c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1e1efd3-4a') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG os_vif [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:42:01,bridge_name='br-int',has_traffic_filtering=True,id=e1e1efd3-4ae0-4211-ab64-685331ae2ffe,network=Network(bf926b5c-baa2-457d-aab9-e2deea0b84c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1e1efd3-4a') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape1e1efd3-4a, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change 
{{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:18:38 user nova-compute[71972]: INFO os_vif [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:68:42:01,bridge_name='br-int',has_traffic_filtering=True,id=e1e1efd3-4ae0-4211-ab64-685331ae2ffe,network=Network(bf926b5c-baa2-457d-aab9-e2deea0b84c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1e1efd3-4a') Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.compute.manager [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Unplugged VIFs for instance {{(pid=71972) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.compute.manager [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.network.neutron [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.network.neutron [req-a4d92633-393e-405f-a5b4-cb5d8cad50de req-4b2e3f13-ffb0-415a-90a0-8698de88b9ad service nova] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Updated VIF entry in instance network info cache for port e1e1efd3-4ae0-4211-ab64-685331ae2ffe. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG nova.network.neutron [req-a4d92633-393e-405f-a5b4-cb5d8cad50de req-4b2e3f13-ffb0-415a-90a0-8698de88b9ad service nova] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Updating instance_info_cache with network_info: [{"id": "e1e1efd3-4ae0-4211-ab64-685331ae2ffe", "address": "fa:16:3e:68:42:01", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tape1e1efd3-4a", "ovs_interfaceid": "e1e1efd3-4ae0-4211-ab64-685331ae2ffe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:18:38 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-a4d92633-393e-405f-a5b4-cb5d8cad50de req-4b2e3f13-ffb0-415a-90a0-8698de88b9ad service nova] Releasing lock "refresh_cache-fde4edce-4669-488e-80e5-6ee0029b19d2" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:18:39 user nova-compute[71972]: DEBUG nova.network.neutron [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:18:39 user nova-compute[71972]: INFO nova.compute.manager [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: fde4edce-4669-488e-80e5-6ee0029b19d2] Took 0.67 seconds to deallocate network for instance. 
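
Throughout this cleanup path (aborting the resource claim, unplugging the VIF, deallocating the network) every "Acquiring lock" / "Acquired lock" / "Releasing lock" DEBUG triple comes from oslo.concurrency's lockutils, which serializes access to shared state such as "compute_resources" and the per-instance "refresh_cache-<uuid>" entries. A minimal sketch of that pattern is below; the lock names follow the log, but the function bodies are placeholders, not Nova code.

    # The context-manager form produces the lock/acquire/release lines logged
    # from lockutils.py:312/315/333 above.
    from oslo_concurrency import lockutils


    def refresh_network_cache(instance_uuid, refresh_fn):
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            return refresh_fn(instance_uuid)


    # The decorator form, used for "compute_resources"-style critical sections
    # such as ResourceTracker.instance_claim / abort_instance_claim.
    @lockutils.synchronized('compute_resources')
    def update_resource_tracker(node_stats):
        return dict(node_stats)

Nova layers its own helpers on top of these primitives, but the acquire/wait/hold timings in the log map directly onto this context-manager and decorator usage.
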
Apr 17 22:18:39 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Deleted allocations for instance fde4edce-4669-488e-80e5-6ee0029b19d2 Apr 17 22:18:39 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-83e68f6e-1d00-4f68-a536-0eb41847d402 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "fde4edce-4669-488e-80e5-6ee0029b19d2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 3.537s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:18:39 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:40 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:18:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:18:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:18:40 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:18:40 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:18:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:18:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0/disk --force-share --output=json" returned: 0 in 0.184s {{(pid=71972) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:18:40 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:18:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0/disk --force-share --output=json" returned: 0 in 0.153s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:18:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:18:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json" returned: 0 in 0.150s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:18:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:18:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:18:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4c1a87de-805e-4878-bd87-56b203e510de/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:18:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD 
"/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4c1a87de-805e-4878-bd87-56b203e510de/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:18:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4c1a87de-805e-4878-bd87-56b203e510de/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:18:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/4c1a87de-805e-4878-bd87-56b203e510de/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:18:42 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:18:42 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:18:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=8765MB free_disk=26.534923553466797GB free_vcpus=9 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": 
"label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": 
"0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:18:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:18:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:18:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance ac56fdf0-5568-434b-ad61-805634c2beeb actively managed 
on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:18:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 4c1a87de-805e-4878-bd87-56b203e510de actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:18:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:18:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 3 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:18:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=896MB phys_disk=40GB used_disk=3GB total_vcpus=12 used_vcpus=3 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:18:42 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:18:42 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:18:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:18:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.261s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:18:42 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:44 user 
nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:18:44 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:18:44 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Rebuilding the list of instances to heal {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 22:18:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "refresh_cache-ac56fdf0-5568-434b-ad61-805634c2beeb" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:18:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquired lock "refresh_cache-ac56fdf0-5568-434b-ad61-805634c2beeb" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:18:44 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Forcefully refreshing network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 22:18:44 user nova-compute[71972]: DEBUG nova.objects.instance [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lazy-loading 'info_cache' on Instance uuid ac56fdf0-5568-434b-ad61-805634c2beeb {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:18:44 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Updating instance_info_cache with network_info: [{"id": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "address": "fa:16:3e:5e:2a:3b", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4acd5cd6-47", "ovs_interfaceid": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:18:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Releasing lock "refresh_cache-ac56fdf0-5568-434b-ad61-805634c2beeb" {{(pid=71972) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:18:44 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Updated the network info_cache for instance {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 22:18:44 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:18:44 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:18:44 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:18:44 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:18:45 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:18:45 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:18:45 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:18:46 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:18:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:47 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:18:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None 
req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Acquiring lock "4c1a87de-805e-4878-bd87-56b203e510de" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Lock "4c1a87de-805e-4878-bd87-56b203e510de" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Acquiring lock "4c1a87de-805e-4878-bd87-56b203e510de-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Lock "4c1a87de-805e-4878-bd87-56b203e510de-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Lock "4c1a87de-805e-4878-bd87-56b203e510de-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:18:54 user nova-compute[71972]: INFO nova.compute.manager [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Terminating instance Apr 17 22:18:54 user nova-compute[71972]: DEBUG nova.compute.manager [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Start destroying the instance on the hypervisor. 
{{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG nova.compute.manager [req-e17ebaac-d3b5-4f94-8e4f-f9c81e0414af req-22517b51-f384-43c4-93e1-36af136d579d service nova] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Received event network-vif-unplugged-7d86402d-0aa3-49a9-9e27-8b623fd9b33a {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-e17ebaac-d3b5-4f94-8e4f-f9c81e0414af req-22517b51-f384-43c4-93e1-36af136d579d service nova] Acquiring lock "4c1a87de-805e-4878-bd87-56b203e510de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-e17ebaac-d3b5-4f94-8e4f-f9c81e0414af req-22517b51-f384-43c4-93e1-36af136d579d service nova] Lock "4c1a87de-805e-4878-bd87-56b203e510de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-e17ebaac-d3b5-4f94-8e4f-f9c81e0414af req-22517b51-f384-43c4-93e1-36af136d579d service nova] Lock "4c1a87de-805e-4878-bd87-56b203e510de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG nova.compute.manager [req-e17ebaac-d3b5-4f94-8e4f-f9c81e0414af req-22517b51-f384-43c4-93e1-36af136d579d service nova] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] No waiting events found dispatching network-vif-unplugged-7d86402d-0aa3-49a9-9e27-8b623fd9b33a {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG nova.compute.manager [req-e17ebaac-d3b5-4f94-8e4f-f9c81e0414af req-22517b51-f384-43c4-93e1-36af136d579d service nova] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Received event network-vif-unplugged-7d86402d-0aa3-49a9-9e27-8b623fd9b33a for instance with task_state deleting. {{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:18:54 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Instance destroyed successfully. 
Apr 17 22:18:54 user nova-compute[71972]: DEBUG nova.objects.instance [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Lazy-loading 'resources' on Instance uuid 4c1a87de-805e-4878-bd87-56b203e510de {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:17:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-SnapshotDataIntegrityTests-server-308754079',display_name='tempest-SnapshotDataIntegrityTests-server-308754079',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-snapshotdataintegritytests-server-308754079',id=20,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFOG3Ac3W7UP7GHDYmWNPhQ0orTfJAryPIfYEeiXGqep9uRC0B5AhMF07UhTVg0hHnzoWVpOM3/yTD0/ALjSUcHNKu/VnX9AxdKs6qx6/qDtrLYFsp1gM6fgmpRc33N0FQ==',key_name='tempest-SnapshotDataIntegrityTests-1740085900',keypairs=,launch_index=0,launched_at=2023-04-17T22:17:08Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='b4f971b7fb95478e83cf0bc9f0eb92d5',ramdisk_id='',reservation_id='r-k9pofijj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-SnapshotDataIntegrityTests-1966210151',owner_user_name='tempest-SnapshotDataIntegrityTests-1966210151-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:17:08Z,user_data=None,user_id='daf86cb2718140b09cbc79de9e54efb6',uuid=4c1a87de-805e-4878-bd87-56b203e510de,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7d86402d-0aa3-49a9-9e27-8b623fd9b33a", "address": "fa:16:3e:74:a3:6e", "network": {"id": "5a0defe9-b217-4eee-8448-70ad161a6de1", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-10075044-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4f971b7fb95478e83cf0bc9f0eb92d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d86402d-0a", "ovs_interfaceid": "7d86402d-0aa3-49a9-9e27-8b623fd9b33a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Converting VIF {"id": "7d86402d-0aa3-49a9-9e27-8b623fd9b33a", "address": "fa:16:3e:74:a3:6e", "network": {"id": "5a0defe9-b217-4eee-8448-70ad161a6de1", "bridge": "br-int", "label": "tempest-SnapshotDataIntegrityTests-10075044-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "b4f971b7fb95478e83cf0bc9f0eb92d5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap7d86402d-0a", "ovs_interfaceid": "7d86402d-0aa3-49a9-9e27-8b623fd9b33a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:a3:6e,bridge_name='br-int',has_traffic_filtering=True,id=7d86402d-0aa3-49a9-9e27-8b623fd9b33a,network=Network(5a0defe9-b217-4eee-8448-70ad161a6de1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d86402d-0a') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG os_vif [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:a3:6e,bridge_name='br-int',has_traffic_filtering=True,id=7d86402d-0aa3-49a9-9e27-8b623fd9b33a,network=Network(5a0defe9-b217-4eee-8448-70ad161a6de1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d86402d-0a') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d86402d-0a, bridge=br-int, 
if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:18:54 user nova-compute[71972]: INFO os_vif [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:a3:6e,bridge_name='br-int',has_traffic_filtering=True,id=7d86402d-0aa3-49a9-9e27-8b623fd9b33a,network=Network(5a0defe9-b217-4eee-8448-70ad161a6de1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7d86402d-0a') Apr 17 22:18:54 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Deleting instance files /opt/stack/data/nova/instances/4c1a87de-805e-4878-bd87-56b203e510de_del Apr 17 22:18:54 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Deletion of /opt/stack/data/nova/instances/4c1a87de-805e-4878-bd87-56b203e510de_del complete Apr 17 22:18:54 user nova-compute[71972]: INFO nova.compute.manager [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Took 0.63 seconds to destroy the instance on the hypervisor. Apr 17 22:18:54 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:18:54 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:18:55 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:18:55 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Took 0.65 seconds to deallocate network for instance. 
Apr 17 22:18:55 user nova-compute[71972]: DEBUG nova.compute.manager [req-66fe6164-5db0-43c5-8d0e-d675a681951b req-210498d7-7f7e-49c9-beb0-5fbc521666f8 service nova] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Received event network-vif-deleted-7d86402d-0aa3-49a9-9e27-8b623fd9b33a {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:18:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:18:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:18:55 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:18:55 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:18:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.150s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:18:55 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Deleted allocations for instance 4c1a87de-805e-4878-bd87-56b203e510de Apr 17 22:18:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-1b864649-c99e-4d80-b2b7-49b722722d66 tempest-SnapshotDataIntegrityTests-1966210151 tempest-SnapshotDataIntegrityTests-1966210151-project-member] Lock "4c1a87de-805e-4878-bd87-56b203e510de" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.602s 
{{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:18:56 user nova-compute[71972]: DEBUG nova.compute.manager [req-5401c902-eefb-466c-a4e7-1d8983b8c616 req-23307f38-b563-4047-9010-8ddd866d773e service nova] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Received event network-vif-plugged-7d86402d-0aa3-49a9-9e27-8b623fd9b33a {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:18:56 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-5401c902-eefb-466c-a4e7-1d8983b8c616 req-23307f38-b563-4047-9010-8ddd866d773e service nova] Acquiring lock "4c1a87de-805e-4878-bd87-56b203e510de-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:18:56 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-5401c902-eefb-466c-a4e7-1d8983b8c616 req-23307f38-b563-4047-9010-8ddd866d773e service nova] Lock "4c1a87de-805e-4878-bd87-56b203e510de-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:18:56 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-5401c902-eefb-466c-a4e7-1d8983b8c616 req-23307f38-b563-4047-9010-8ddd866d773e service nova] Lock "4c1a87de-805e-4878-bd87-56b203e510de-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:18:56 user nova-compute[71972]: DEBUG nova.compute.manager [req-5401c902-eefb-466c-a4e7-1d8983b8c616 req-23307f38-b563-4047-9010-8ddd866d773e service nova] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] No waiting events found dispatching network-vif-plugged-7d86402d-0aa3-49a9-9e27-8b623fd9b33a {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:18:56 user nova-compute[71972]: WARNING nova.compute.manager [req-5401c902-eefb-466c-a4e7-1d8983b8c616 req-23307f38-b563-4047-9010-8ddd866d773e service nova] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Received unexpected event network-vif-plugged-7d86402d-0aa3-49a9-9e27-8b623fd9b33a for instance with vm_state deleted and task_state None. 
Apr 17 22:18:57 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:18:59 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:04 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:19:09 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:19:09 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] VM Stopped (Lifecycle Event) Apr 17 22:19:09 user nova-compute[71972]: DEBUG nova.compute.manager [None req-3d0485fe-321a-4754-a405-df7d5a732bcd None None] [instance: 4c1a87de-805e-4878-bd87-56b203e510de] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:19:09 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:19:09 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:09 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:19:09 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:19:09 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:19:09 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:19 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:19:24 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:26 user nova-compute[71972]: DEBUG nova.compute.manager [req-3a36cdba-1e4e-40d5-a7b2-7e592bf9e7ac req-b24d8f67-834e-44d4-ac04-7a481d5678d8 service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Received event network-changed-2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:19:26 user nova-compute[71972]: DEBUG nova.compute.manager [req-3a36cdba-1e4e-40d5-a7b2-7e592bf9e7ac req-b24d8f67-834e-44d4-ac04-7a481d5678d8 service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Refreshing instance network info cache due to event 
network-changed-2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560. {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:19:26 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-3a36cdba-1e4e-40d5-a7b2-7e592bf9e7ac req-b24d8f67-834e-44d4-ac04-7a481d5678d8 service nova] Acquiring lock "refresh_cache-d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:19:26 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-3a36cdba-1e4e-40d5-a7b2-7e592bf9e7ac req-b24d8f67-834e-44d4-ac04-7a481d5678d8 service nova] Acquired lock "refresh_cache-d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:19:26 user nova-compute[71972]: DEBUG nova.network.neutron [req-3a36cdba-1e4e-40d5-a7b2-7e592bf9e7ac req-b24d8f67-834e-44d4-ac04-7a481d5678d8 service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Refreshing network info cache for port 2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:19:27 user nova-compute[71972]: DEBUG nova.network.neutron [req-3a36cdba-1e4e-40d5-a7b2-7e592bf9e7ac req-b24d8f67-834e-44d4-ac04-7a481d5678d8 service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Updated VIF entry in instance network info cache for port 2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560. {{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:19:27 user nova-compute[71972]: DEBUG nova.network.neutron [req-3a36cdba-1e4e-40d5-a7b2-7e592bf9e7ac req-b24d8f67-834e-44d4-ac04-7a481d5678d8 service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Updating instance_info_cache with network_info: [{"id": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "address": "fa:16:3e:f4:40:f4", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.66", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fd0b7cd-39", "ovs_interfaceid": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:19:27 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-3a36cdba-1e4e-40d5-a7b2-7e592bf9e7ac req-b24d8f67-834e-44d4-ac04-7a481d5678d8 service nova] Releasing lock "refresh_cache-d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:19:27 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:27 user 
nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "73530c77-9d35-486b-af6b-3773519c4206" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:19:27 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "73530c77-9d35-486b-af6b-3773519c4206" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG nova.compute.manager [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Starting instance... {{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Acquiring lock "d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Acquiring lock "d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0-events" "released" by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:19:28 user nova-compute[71972]: INFO nova.compute.manager [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Terminating instance Apr 17 22:19:28 user nova-compute[71972]: DEBUG nova.compute.manager [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Start destroying the instance on the hypervisor. {{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:19:28 user nova-compute[71972]: INFO nova.compute.claims [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Claim successful on node user Apr 17 22:19:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG nova.compute.manager [req-106fca60-8348-4af7-96be-e5d795929fd4 req-5975f450-be8b-4d1a-981b-22e29abbdbb1 service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Received event network-vif-unplugged-2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-106fca60-8348-4af7-96be-e5d795929fd4 req-5975f450-be8b-4d1a-981b-22e29abbdbb1 service nova] Acquiring lock "d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-106fca60-8348-4af7-96be-e5d795929fd4 req-5975f450-be8b-4d1a-981b-22e29abbdbb1 service nova] Lock "d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-106fca60-8348-4af7-96be-e5d795929fd4 req-5975f450-be8b-4d1a-981b-22e29abbdbb1 service nova] Lock "d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG nova.compute.manager [req-106fca60-8348-4af7-96be-e5d795929fd4 req-5975f450-be8b-4d1a-981b-22e29abbdbb1 service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] No waiting events found dispatching network-vif-unplugged-2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG nova.compute.manager [req-106fca60-8348-4af7-96be-e5d795929fd4 req-5975f450-be8b-4d1a-981b-22e29abbdbb1 service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Received event network-vif-unplugged-2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560 for instance with task_state deleting. 
{{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.289s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG nova.compute.manager [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG nova.compute.manager [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Allocating IP information in the background. {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG nova.network.neutron [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:19:28 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 17 22:19:28 user nova-compute[71972]: DEBUG nova.compute.manager [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Start building block device mappings for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG nova.compute.manager [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Start spawning the instance on the hypervisor. {{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:19:28 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Creating image(s) Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "/opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "/opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "/opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit 
--as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG nova.policy [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6df5551ab4974747a0412ce089b770b0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '26967ac9e8cb45b6aea04a699d4a1eca', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.129s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:19:28 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Instance destroyed successfully. 
Apr 17 22:19:28 user nova-compute[71972]: DEBUG nova.objects.instance [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lazy-loading 'resources' on Instance uuid d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:17:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1878222400',display_name='tempest-AttachVolumeTestJSON-server-1878222400',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-attachvolumetestjson-server-1878222400',id=21,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKioJtk4DRUjErpS+dQkd6Zh3VD3Z5KPplVwI2N0nKmntp65lObvZvx08UYCyY2b8WcQy2GnzCG0FbEpNEq4UnZs1TTLDZZcBxjo7SevW/18tN2cTzyfQlmnoJ85kbLQJQ==',key_name='tempest-keypair-4218837',keypairs=,launch_index=0,launched_at=2023-04-17T22:17:41Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='19f2a3034fe9458682e1242c91e2ce45',ramdisk_id='',reservation_id='r-5o6t1u5a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-AttachVolumeTestJSON-4448958',owner_user_name='tempest-AttachVolumeTestJSON-4448958-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:17:42Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='30c12a4244db438ea682e545c378abe1',uuid=d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "address": "fa:16:3e:f4:40:f4", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[{"address": "172.24.4.66", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fd0b7cd-39", "ovs_interfaceid": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Converting VIF {"id": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "address": "fa:16:3e:f4:40:f4", "network": {"id": "8ffc4041-9c9c-4b0f-9342-dcd76540515a", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-82115333-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.66", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "19f2a3034fe9458682e1242c91e2ce45", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap2fd0b7cd-39", "ovs_interfaceid": "2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f4:40:f4,bridge_name='br-int',has_traffic_filtering=True,id=2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560,network=Network(8ffc4041-9c9c-4b0f-9342-dcd76540515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fd0b7cd-39') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG os_vif [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:40:f4,bridge_name='br-int',has_traffic_filtering=True,id=2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560,network=Network(8ffc4041-9c9c-4b0f-9342-dcd76540515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fd0b7cd-39') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running 
txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2fd0b7cd-39, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:19:28 user nova-compute[71972]: INFO os_vif [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:40:f4,bridge_name='br-int',has_traffic_filtering=True,id=2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560,network=Network(8ffc4041-9c9c-4b0f-9342-dcd76540515a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2fd0b7cd-39') Apr 17 22:19:28 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Deleting instance files /opt/stack/data/nova/instances/d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0_del Apr 17 22:19:28 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Deletion of /opt/stack/data/nova/instances/d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0_del complete Apr 17 22:19:28 user nova-compute[71972]: INFO nova.compute.manager [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Took 0.89 seconds to destroy the instance on the hypervisor. Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.158s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk 1073741824" returned: 0 in 0.060s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.222s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:19:28 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:19:29 user nova-compute[71972]: DEBUG 
oslo_concurrency.processutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.141s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:19:29 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Checking if we can resize image /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk. size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:19:29 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:19:29 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json" returned: 0 in 0.142s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:19:29 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Cannot resize image /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk to a smaller size. 
{{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:19:29 user nova-compute[71972]: DEBUG nova.objects.instance [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lazy-loading 'migration_context' on Instance uuid 73530c77-9d35-486b-af6b-3773519c4206 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:19:29 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:19:29 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Ensure instance console log exists: /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:19:29 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:19:29 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:19:29 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:19:29 user nova-compute[71972]: DEBUG nova.network.neutron [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Successfully created port: de79c687-16bd-4f64-a449-70b021e4ea02 {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:19:29 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:19:29 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Took 
0.83 seconds to deallocate network for instance. Apr 17 22:19:29 user nova-compute[71972]: DEBUG nova.compute.manager [req-c2a06520-33de-4c08-8db6-764751a697ca req-9bac1155-6311-452d-8f64-f3e988221bf0 service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Received event network-vif-deleted-2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:19:29 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:19:29 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:19:29 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:19:29 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:19:29 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.168s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:19:30 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Deleted allocations for instance d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0 Apr 17 22:19:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-7f94d201-faf1-4231-a8c4-3e0c1f1b1978 tempest-AttachVolumeTestJSON-4448958 tempest-AttachVolumeTestJSON-4448958-project-member] Lock "d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.078s {{(pid=71972) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.network.neutron [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Successfully updated port: de79c687-16bd-4f64-a449-70b021e4ea02 {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.compute.manager [req-a241a7f5-b1f0-4437-be41-69899c25038b req-ca5c46be-2672-4261-b6fe-9fa67e2f7437 service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Received event network-vif-plugged-2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-a241a7f5-b1f0-4437-be41-69899c25038b req-ca5c46be-2672-4261-b6fe-9fa67e2f7437 service nova] Acquiring lock "d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-a241a7f5-b1f0-4437-be41-69899c25038b req-ca5c46be-2672-4261-b6fe-9fa67e2f7437 service nova] Lock "d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-a241a7f5-b1f0-4437-be41-69899c25038b req-ca5c46be-2672-4261-b6fe-9fa67e2f7437 service nova] Lock "d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.compute.manager [req-a241a7f5-b1f0-4437-be41-69899c25038b req-ca5c46be-2672-4261-b6fe-9fa67e2f7437 service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] No waiting events found dispatching network-vif-plugged-2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:19:30 user nova-compute[71972]: WARNING nova.compute.manager [req-a241a7f5-b1f0-4437-be41-69899c25038b req-ca5c46be-2672-4261-b6fe-9fa67e2f7437 service nova] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Received unexpected event network-vif-plugged-2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560 for instance with vm_state deleted and task_state None. 
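The entries above show the external-event plumbing: Neutron posts network-vif-plugged / network-vif-deleted notifications to nova-compute, each pop is serialized under a per-instance "<uuid>-events" lock, and when nothing registered interest (the instance here is already deleted) the service logs the "Received unexpected event" warning. A minimal sketch of that bookkeeping, with illustrative names rather than Nova's actual classes:

import threading
from concurrent.futures import Future

class InstanceEventSketch:
    def __init__(self):
        self._lock = threading.Lock()   # stands in for the "<uuid>-events" lock above
        self._events = {}               # {instance_uuid: {event_name: Future}}

    def prepare(self, instance_uuid, event_name):
        # Called by the code path that will later wait on the event.
        fut = Future()
        with self._lock:
            self._events.setdefault(instance_uuid, {})[event_name] = fut
        return fut

    def pop(self, instance_uuid, event_name):
        # Called when a Neutron notification arrives; None means "unexpected".
        with self._lock:
            return self._events.get(instance_uuid, {}).pop(event_name, None)

events = InstanceEventSketch()
incoming = ("d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0",
            "network-vif-plugged-2fd0b7cd-39e8-4f3e-85bf-dee1bccf6560")
waiter = events.pop(*incoming)
if waiter is None:
    # Mirrors the WARNING above: the instance was already destroyed, so nothing
    # had registered interest in this notification.
    print("Received unexpected event %s" % incoming[1])
else:
    waiter.set_result(incoming[1])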
Apr 17 22:19:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "refresh_cache-73530c77-9d35-486b-af6b-3773519c4206" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquired lock "refresh_cache-73530c77-9d35-486b-af6b-3773519c4206" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.network.neutron [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.network.neutron [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Instance cache missing network info. {{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.network.neutron [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Updating instance_info_cache with network_info: [{"id": "de79c687-16bd-4f64-a449-70b021e4ea02", "address": "fa:16:3e:62:9f:00", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapde79c687-16", "ovs_interfaceid": "de79c687-16bd-4f64-a449-70b021e4ea02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Releasing lock "refresh_cache-73530c77-9d35-486b-af6b-3773519c4206" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:19:30 user nova-compute[71972]: 
DEBUG nova.compute.manager [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Instance network_info: |[{"id": "de79c687-16bd-4f64-a449-70b021e4ea02", "address": "fa:16:3e:62:9f:00", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapde79c687-16", "ovs_interfaceid": "de79c687-16bd-4f64-a449-70b021e4ea02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Start _get_guest_xml network_info=[{"id": "de79c687-16bd-4f64-a449-70b021e4ea02", "address": "fa:16:3e:62:9f:00", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapde79c687-16", "ovs_interfaceid": "de79c687-16bd-4f64-a449-70b021e4ea02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 
'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:19:30 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:19:30 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:19:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1002520753',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1002520753',id=23,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='26967ac9e8cb45b6aea04a699d4a1eca',ramdisk_id='',reservation_id='r-zpu4rruz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-480550513',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:19:29Z,user_data=None,user_id='6df5551ab4974747a0412ce089b770b0',uuid=73530c77-9d35-486b-af6b-3773519c4206,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "de79c687-16bd-4f64-a449-70b021e4ea02", "address": "fa:16:3e:62:9f:00", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapde79c687-16", "ovs_interfaceid": "de79c687-16bd-4f64-a449-70b021e4ea02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config /opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Converting VIF {"id": "de79c687-16bd-4f64-a449-70b021e4ea02", "address": 
"fa:16:3e:62:9f:00", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapde79c687-16", "ovs_interfaceid": "de79c687-16bd-4f64-a449-70b021e4ea02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:9f:00,bridge_name='br-int',has_traffic_filtering=True,id=de79c687-16bd-4f64-a449-70b021e4ea02,network=Network(bf926b5c-baa2-457d-aab9-e2deea0b84c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde79c687-16') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.objects.instance [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lazy-loading 'pci_devices' on Instance uuid 73530c77-9d35-486b-af6b-3773519c4206 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] End _get_guest_xml xml= Apr 17 22:19:30 user nova-compute[71972]: 73530c77-9d35-486b-af6b-3773519c4206 Apr 17 22:19:30 user nova-compute[71972]: instance-00000017 Apr 17 22:19:30 user nova-compute[71972]: 131072 Apr 17 22:19:30 user nova-compute[71972]: 1 Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: tempest-ServerBootFromVolumeStableRescueTest-server-1002520753 Apr 17 22:19:30 user nova-compute[71972]: 2023-04-17 22:19:30 Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: 128 Apr 17 22:19:30 user nova-compute[71972]: 1 Apr 17 22:19:30 user nova-compute[71972]: 0 Apr 17 22:19:30 user nova-compute[71972]: 0 Apr 17 22:19:30 user nova-compute[71972]: 1 Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member Apr 17 22:19:30 user nova-compute[71972]: tempest-ServerBootFromVolumeStableRescueTest-480550513 Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user 
nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: OpenStack Foundation Apr 17 22:19:30 user nova-compute[71972]: OpenStack Nova Apr 17 22:19:30 user nova-compute[71972]: 0.0.0 Apr 17 22:19:30 user nova-compute[71972]: 73530c77-9d35-486b-af6b-3773519c4206 Apr 17 22:19:30 user nova-compute[71972]: 73530c77-9d35-486b-af6b-3773519c4206 Apr 17 22:19:30 user nova-compute[71972]: Virtual Machine Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: hvm Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Nehalem Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: /dev/urandom Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: Apr 17 22:19:30 user nova-compute[71972]: {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:19:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1002520753',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1002520753',id=23,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='26967ac9e8cb45b6aea04a699d4a1eca',ramdisk_id='',reservation_id='r-zpu4rruz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-480550513',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:19:29Z,user_data=None,user_id='6df5551ab4974747a0412ce089b770b0',uuid=73530c77-9d35-486b-af6b-3773519c4206,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "de79c687-16bd-4f64-a449-70b021e4ea02", "address": "fa:16:3e:62:9f:00", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapde79c687-16", "ovs_interfaceid": "de79c687-16bd-4f64-a449-70b021e4ea02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug /opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Converting VIF {"id": "de79c687-16bd-4f64-a449-70b021e4ea02", "address": 
"fa:16:3e:62:9f:00", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapde79c687-16", "ovs_interfaceid": "de79c687-16bd-4f64-a449-70b021e4ea02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:9f:00,bridge_name='br-int',has_traffic_filtering=True,id=de79c687-16bd-4f64-a449-70b021e4ea02,network=Network(bf926b5c-baa2-457d-aab9-e2deea0b84c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde79c687-16') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG os_vif [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:9f:00,bridge_name='br-int',has_traffic_filtering=True,id=de79c687-16bd-4f64-a449-70b021e4ea02,network=Network(bf926b5c-baa2-457d-aab9-e2deea0b84c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde79c687-16') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:19:30 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:19:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapde79c687-16, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:19:31 user 
nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapde79c687-16, col_values=(('external_ids', {'iface-id': 'de79c687-16bd-4f64-a449-70b021e4ea02', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:62:9f:00', 'vm-uuid': '73530c77-9d35-486b-af6b-3773519c4206'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:19:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:19:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:31 user nova-compute[71972]: INFO os_vif [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:9f:00,bridge_name='br-int',has_traffic_filtering=True,id=de79c687-16bd-4f64-a449-70b021e4ea02,network=Network(bf926b5c-baa2-457d-aab9-e2deea0b84c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde79c687-16') Apr 17 22:19:31 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] No BDM found with device name vda, not building metadata. 
{{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:19:31 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] No VIF found with MAC fa:16:3e:62:9f:00, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:19:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:32 user nova-compute[71972]: DEBUG nova.compute.manager [req-22d7fc7c-9ce9-4ed4-bb6f-4b67ca7a90df req-a428fdac-d87d-4ae1-a133-64f5ff78023a service nova] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Received event network-changed-de79c687-16bd-4f64-a449-70b021e4ea02 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:19:32 user nova-compute[71972]: DEBUG nova.compute.manager [req-22d7fc7c-9ce9-4ed4-bb6f-4b67ca7a90df req-a428fdac-d87d-4ae1-a133-64f5ff78023a service nova] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Refreshing instance network info cache due to event network-changed-de79c687-16bd-4f64-a449-70b021e4ea02. {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:19:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-22d7fc7c-9ce9-4ed4-bb6f-4b67ca7a90df req-a428fdac-d87d-4ae1-a133-64f5ff78023a service nova] Acquiring lock "refresh_cache-73530c77-9d35-486b-af6b-3773519c4206" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:19:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-22d7fc7c-9ce9-4ed4-bb6f-4b67ca7a90df req-a428fdac-d87d-4ae1-a133-64f5ff78023a service nova] Acquired lock "refresh_cache-73530c77-9d35-486b-af6b-3773519c4206" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:19:32 user nova-compute[71972]: DEBUG nova.network.neutron [req-22d7fc7c-9ce9-4ed4-bb6f-4b67ca7a90df req-a428fdac-d87d-4ae1-a133-64f5ff78023a service nova] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Refreshing network info cache for port de79c687-16bd-4f64-a449-70b021e4ea02 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:19:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:33 user nova-compute[71972]: DEBUG nova.network.neutron [req-22d7fc7c-9ce9-4ed4-bb6f-4b67ca7a90df 
req-a428fdac-d87d-4ae1-a133-64f5ff78023a service nova] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Updated VIF entry in instance network info cache for port de79c687-16bd-4f64-a449-70b021e4ea02. {{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:19:33 user nova-compute[71972]: DEBUG nova.network.neutron [req-22d7fc7c-9ce9-4ed4-bb6f-4b67ca7a90df req-a428fdac-d87d-4ae1-a133-64f5ff78023a service nova] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Updating instance_info_cache with network_info: [{"id": "de79c687-16bd-4f64-a449-70b021e4ea02", "address": "fa:16:3e:62:9f:00", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapde79c687-16", "ovs_interfaceid": "de79c687-16bd-4f64-a449-70b021e4ea02", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:19:33 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-22d7fc7c-9ce9-4ed4-bb6f-4b67ca7a90df req-a428fdac-d87d-4ae1-a133-64f5ff78023a service nova] Releasing lock "refresh_cache-73530c77-9d35-486b-af6b-3773519c4206" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:19:34 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:19:34 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 73530c77-9d35-486b-af6b-3773519c4206] VM Resumed (Lifecycle Event) Apr 17 22:19:34 user nova-compute[71972]: DEBUG nova.compute.manager [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Instance event wait completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:19:34 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:19:34 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Instance spawned successfully. 
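The "Instance event wait completed in 0 seconds" and "Guest created on hypervisor" lines reflect the spawn-time ordering: the expected network-vif-plugged events are registered first, the VIFs are plugged and the domain is created, and only then does the driver wait, with a deadline, for Neutron's confirmation (the log cites wait_for_instance_event in nova/compute/manager.py for this). A rough, self-contained sketch of that ordering under assumed names, not Nova's actual implementation:

from concurrent.futures import Future
from contextlib import contextmanager

@contextmanager
def expect_events(event_names, timeout=300):
    # Register the events we expect before doing the work that triggers them.
    futures = {name: Future() for name in event_names}
    try:
        yield futures
    finally:
        # Block (with a deadline) until every expected notification has arrived.
        for fut in futures.values():
            fut.result(timeout=timeout)

def plug_vifs_and_create_domain(futures):
    # Stand-in for os-vif plugging plus libvirt domain creation; here we simply
    # pretend the network-vif-plugged confirmations already came back.
    for fut in futures.values():
        fut.set_result(True)

with expect_events(["network-vif-plugged-de79c687-16bd-4f64-a449-70b021e4ea02"]) as futs:
    plug_vifs_and_create_domain(futs)
print("guest created; all expected events received")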
Apr 17 22:19:34 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:19:34 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:19:34 user nova-compute[71972]: DEBUG nova.compute.manager [req-63e90a32-bfdc-4f80-ab3e-0849cb13f7dc req-095b44c5-8c8e-46aa-8fd2-e1fb100ae7ab service nova] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Received event network-vif-plugged-de79c687-16bd-4f64-a449-70b021e4ea02 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:19:34 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-63e90a32-bfdc-4f80-ab3e-0849cb13f7dc req-095b44c5-8c8e-46aa-8fd2-e1fb100ae7ab service nova] Acquiring lock "73530c77-9d35-486b-af6b-3773519c4206-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:19:34 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-63e90a32-bfdc-4f80-ab3e-0849cb13f7dc req-095b44c5-8c8e-46aa-8fd2-e1fb100ae7ab service nova] Lock "73530c77-9d35-486b-af6b-3773519c4206-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:19:34 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-63e90a32-bfdc-4f80-ab3e-0849cb13f7dc req-095b44c5-8c8e-46aa-8fd2-e1fb100ae7ab service nova] Lock "73530c77-9d35-486b-af6b-3773519c4206-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:19:34 user nova-compute[71972]: DEBUG nova.compute.manager [req-63e90a32-bfdc-4f80-ab3e-0849cb13f7dc req-095b44c5-8c8e-46aa-8fd2-e1fb100ae7ab service nova] [instance: 73530c77-9d35-486b-af6b-3773519c4206] No waiting events found dispatching network-vif-plugged-de79c687-16bd-4f64-a449-70b021e4ea02 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:19:34 user nova-compute[71972]: WARNING nova.compute.manager [req-63e90a32-bfdc-4f80-ab3e-0849cb13f7dc req-095b44c5-8c8e-46aa-8fd2-e1fb100ae7ab service nova] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Received unexpected event network-vif-plugged-de79c687-16bd-4f64-a449-70b021e4ea02 for instance with vm_state building and task_state spawning. 
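The "Acquiring lock ... / acquired ... waited 0.001s / released ... held 0.000s" triplets around each event pop come from oslo.concurrency's synchronized decorator, keyed here on "<instance uuid>-events". A small standalone illustration of that locking pattern; the function body is illustrative, not Nova's:

from oslo_concurrency import lockutils

EVENTS_LOCK = "73530c77-9d35-486b-af6b-3773519c4206-events"

@lockutils.synchronized(EVENTS_LOCK)
def pop_event(pending, name):
    # Everything in here is serialized with other holders of the same lock name
    # within this process; the decorator also emits the acquired/released lines.
    return pending.pop(name, None)

pending = {"network-vif-plugged-de79c687-16bd-4f64-a449-70b021e4ea02": object()}
print(pop_event(pending, "network-vif-plugged-de79c687-16bd-4f64-a449-70b021e4ea02"))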
Apr 17 22:19:34 user nova-compute[71972]: DEBUG nova.compute.manager [req-63e90a32-bfdc-4f80-ab3e-0849cb13f7dc req-095b44c5-8c8e-46aa-8fd2-e1fb100ae7ab service nova] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Received event network-vif-plugged-de79c687-16bd-4f64-a449-70b021e4ea02 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:19:34 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-63e90a32-bfdc-4f80-ab3e-0849cb13f7dc req-095b44c5-8c8e-46aa-8fd2-e1fb100ae7ab service nova] Acquiring lock "73530c77-9d35-486b-af6b-3773519c4206-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:19:34 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-63e90a32-bfdc-4f80-ab3e-0849cb13f7dc req-095b44c5-8c8e-46aa-8fd2-e1fb100ae7ab service nova] Lock "73530c77-9d35-486b-af6b-3773519c4206-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:19:34 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-63e90a32-bfdc-4f80-ab3e-0849cb13f7dc req-095b44c5-8c8e-46aa-8fd2-e1fb100ae7ab service nova] Lock "73530c77-9d35-486b-af6b-3773519c4206-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:19:34 user nova-compute[71972]: DEBUG nova.compute.manager [req-63e90a32-bfdc-4f80-ab3e-0849cb13f7dc req-095b44c5-8c8e-46aa-8fd2-e1fb100ae7ab service nova] [instance: 73530c77-9d35-486b-af6b-3773519c4206] No waiting events found dispatching network-vif-plugged-de79c687-16bd-4f64-a449-70b021e4ea02 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:19:34 user nova-compute[71972]: WARNING nova.compute.manager [req-63e90a32-bfdc-4f80-ab3e-0849cb13f7dc req-095b44c5-8c8e-46aa-8fd2-e1fb100ae7ab service nova] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Received unexpected event network-vif-plugged-de79c687-16bd-4f64-a449-70b021e4ea02 for instance with vm_state building and task_state spawning. 
Apr 17 22:19:34 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:19:34 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:19:34 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:19:34 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:19:34 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:19:34 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:19:34 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Found default for hw_vif_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:19:34 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 73530c77-9d35-486b-af6b-3773519c4206] During sync_power_state the instance has a pending task (spawning). Skip. 
Apr 17 22:19:34 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:19:34 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 73530c77-9d35-486b-af6b-3773519c4206] VM Started (Lifecycle Event) Apr 17 22:19:34 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:19:34 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:19:34 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 73530c77-9d35-486b-af6b-3773519c4206] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:19:34 user nova-compute[71972]: INFO nova.compute.manager [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Took 5.95 seconds to spawn the instance on the hypervisor. Apr 17 22:19:34 user nova-compute[71972]: DEBUG nova.compute.manager [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:19:34 user nova-compute[71972]: INFO nova.compute.manager [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Took 6.59 seconds to build instance. 
Apr 17 22:19:34 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-be0e1735-bca0-4887-a81f-0434be712ea0 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "73530c77-9d35-486b-af6b-3773519c4206" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.695s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:19:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:39 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:19:39 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Cleaning up deleted instances with incomplete migration {{(pid=71972) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 17 22:19:41 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:19:41 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:41 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:19:41 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:19:41 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:19:41 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:41 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:19:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:19:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:19:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils 
[None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:19:41 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:19:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:19:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:19:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:19:42 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:19:42 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:19:42 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:19:42 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 
None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:19:42 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:19:42 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:19:42 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:19:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=8882MB free_disk=26.56917953491211GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", 
"vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:19:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:19:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:19:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance ac56fdf0-5568-434b-ad61-805634c2beeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:19:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 73530c77-9d35-486b-af6b-3773519c4206 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:19:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:19:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:19:42 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:19:42 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:19:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:19:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.241s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:19:43 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:19:43 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] VM Stopped (Lifecycle Event) Apr 17 22:19:43 user nova-compute[71972]: DEBUG nova.compute.manager [None req-fd427960-5c69-4be6-8933-d06a37c6deac None None] [instance: d4d90f0c-f7d3-4a1e-ac10-7128e6bddcb0] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:19:45 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task 
ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:19:45 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:19:45 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Didn't find any instances for network info cache update. {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 17 22:19:45 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:19:45 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:19:45 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:19:45 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:19:45 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:19:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:19:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:19:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:19:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:19:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:46 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:19:46 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:19:46 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:19:47 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:19:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:49 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:53 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None 
req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:19:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Cleaning up deleted instances {{(pid=71972) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 17 22:19:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] There are 0 instances to clean {{(pid=71972) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 17 22:19:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:19:57 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:20:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:20:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:20:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:20:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:20:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:20:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:20:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:20:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:20:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:20:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:20:21 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:20:21 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:20:21 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71972) run 
/usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:20:21 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:20:21 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:20:21 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:20:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:20:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:20:26 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:20:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:20:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:20:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:20:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:20:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:20:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:20:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:20:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:20:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:20:41 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:20:41 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:20:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None 
req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:20:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.003s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:20:41 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:20:41 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:20:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:20:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json" returned: 0 in 0.140s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:20:41 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:20:42 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:20:42 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json 
{{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:20:42 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:20:42 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:20:42 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:20:42 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:20:42 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 22:20:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=8979MB free_disk=26.538890838623047GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:20:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:20:42 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:20:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance ac56fdf0-5568-434b-ad61-805634c2beeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:20:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 73530c77-9d35-486b-af6b-3773519c4206 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:20:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:20:42 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:20:43 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Refreshing inventories for resource provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 17 22:20:43 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Updating ProviderTree inventory for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 17 22:20:43 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Updating inventory in ProviderTree for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 17 22:20:43 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Refreshing aggregate associations for resource provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e, aggregates: None {{(pid=71972) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 17 22:20:43 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Refreshing trait associations for resource provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e, traits: 
HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_RESCUE_BFV,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NODE,HW_CPU_X86_SSE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_VOLUME_MULTI_ATTACH {{(pid=71972) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 17 22:20:43 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:20:43 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:20:43 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:20:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.517s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:20:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:20:46 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:20:46 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache 
{{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:20:46 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Rebuilding the list of instances to heal {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 22:20:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "refresh_cache-ac56fdf0-5568-434b-ad61-805634c2beeb" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:20:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquired lock "refresh_cache-ac56fdf0-5568-434b-ad61-805634c2beeb" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:20:46 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Forcefully refreshing network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 22:20:46 user nova-compute[71972]: DEBUG nova.objects.instance [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lazy-loading 'info_cache' on Instance uuid ac56fdf0-5568-434b-ad61-805634c2beeb {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:20:46 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Updating instance_info_cache with network_info: [{"id": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "address": "fa:16:3e:5e:2a:3b", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4acd5cd6-47", "ovs_interfaceid": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:20:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Releasing lock "refresh_cache-ac56fdf0-5568-434b-ad61-805634c2beeb" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:20:46 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Updated the network info_cache for instance {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 22:20:46 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None 
req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:20:46 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:20:46 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:20:46 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:20:46 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:20:47 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:20:47 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:20:48 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:20:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:20:52 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:20:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:20:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:20:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:20:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition 
/usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:20:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:20:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:20:57 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:21:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:21:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:21:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:21:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:21:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:21:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:21:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:21:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:21:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:21:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:21:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:21:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:21:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:21:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:21:19 user nova-compute[71972]: DEBUG nova.compute.manager [None req-74b59b6c-6fd9-4194-bf21-5af22fcdc1d6 
tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:21:19 user nova-compute[71972]: INFO nova.compute.manager [None req-74b59b6c-6fd9-4194-bf21-5af22fcdc1d6 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] instance snapshotting Apr 17 22:21:19 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-74b59b6c-6fd9-4194-bf21-5af22fcdc1d6 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Beginning live snapshot process Apr 17 22:21:19 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-74b59b6c-6fd9-4194-bf21-5af22fcdc1d6 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json -f qcow2 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:21:19 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-74b59b6c-6fd9-4194-bf21-5af22fcdc1d6 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json -f qcow2" returned: 0 in 0.136s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:21:19 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-74b59b6c-6fd9-4194-bf21-5af22fcdc1d6 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json -f qcow2 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:21:19 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-74b59b6c-6fd9-4194-bf21-5af22fcdc1d6 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json -f qcow2" returned: 0 in 0.130s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:21:19 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-74b59b6c-6fd9-4194-bf21-5af22fcdc1d6 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] 
Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:21:19 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-74b59b6c-6fd9-4194-bf21-5af22fcdc1d6 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.131s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:21:19 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-74b59b6c-6fd9-4194-bf21-5af22fcdc1d6 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpvkoub1i8/bfc4b7532ee24bdfb8662c5687d39d72.delta 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:21:20 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-74b59b6c-6fd9-4194-bf21-5af22fcdc1d6 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmpvkoub1i8/bfc4b7532ee24bdfb8662c5687d39d72.delta 1073741824" returned: 0 in 0.053s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:21:20 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-74b59b6c-6fd9-4194-bf21-5af22fcdc1d6 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Quiescing instance not available: QEMU guest agent is not enabled. 
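The entries above show the live-snapshot disk handling: qemu-img info probes the instance disk and its cached base image, qemu-img create builds a qcow2 overlay (.delta) backed by that base, and, once the copy block job finishes, qemu-img convert flattens the overlay into the standalone image that gets uploaded. A minimal out-of-tree sketch of that same qemu-img sequence follows; it assumes qemu-img is on PATH and uses placeholder paths rather than the instance paths from the log:

    import os
    import subprocess

    # Placeholder paths; the run above used the instance's _base cache entry as
    # the backing file and a tmp directory under instances/snapshots/ for output.
    base = "/tmp/base.raw"          # existing raw base image
    delta = "/tmp/snapshot.delta"   # qcow2 overlay that receives guest writes
    out = "/tmp/snapshot.qcow2"     # flattened image to upload
    env = dict(os.environ, LC_ALL="C", LANG="C")

    # Overlay backed by the raw base image, 1 GiB virtual size, mirroring the
    # "qemu-img create -f qcow2 -o backing_file=...,backing_fmt=raw" call above.
    subprocess.run(
        ["qemu-img", "create", "-f", "qcow2",
         "-o", f"backing_file={base},backing_fmt=raw", delta, "1073741824"],
        env=env, check=True)

    # Flatten the overlay into a self-contained qcow2; cache mode "none"
    # (-t none) needs O_DIRECT support on the target filesystem.
    subprocess.run(
        ["qemu-img", "convert", "-t", "none", "-O", "qcow2", "-f", "qcow2",
         delta, out],
        env=env, check=True)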
Apr 17 22:21:20 user nova-compute[71972]: DEBUG nova.virt.libvirt.guest [None req-74b59b6c-6fd9-4194-bf21-5af22fcdc1d6 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] COPY block job progress, current cursor: 0 final cursor: 43778048 {{(pid=71972) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 17 22:21:21 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:21:21 user nova-compute[71972]: DEBUG nova.virt.libvirt.guest [None req-74b59b6c-6fd9-4194-bf21-5af22fcdc1d6 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=71972) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 17 22:21:21 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-74b59b6c-6fd9-4194-bf21-5af22fcdc1d6 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Skipping quiescing instance: QEMU guest agent is not enabled. Apr 17 22:21:21 user nova-compute[71972]: DEBUG nova.privsep.utils [None req-74b59b6c-6fd9-4194-bf21-5af22fcdc1d6 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71972) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 17 22:21:21 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-74b59b6c-6fd9-4194-bf21-5af22fcdc1d6 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpvkoub1i8/bfc4b7532ee24bdfb8662c5687d39d72.delta /opt/stack/data/nova/instances/snapshots/tmpvkoub1i8/bfc4b7532ee24bdfb8662c5687d39d72 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:21:21 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-74b59b6c-6fd9-4194-bf21-5af22fcdc1d6 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmpvkoub1i8/bfc4b7532ee24bdfb8662c5687d39d72.delta /opt/stack/data/nova/instances/snapshots/tmpvkoub1i8/bfc4b7532ee24bdfb8662c5687d39d72" returned: 0 in 0.480s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:21:21 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-74b59b6c-6fd9-4194-bf21-5af22fcdc1d6 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Snapshot extracted, beginning image upload Apr 17 22:21:24 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-74b59b6c-6fd9-4194-bf21-5af22fcdc1d6 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Snapshot image upload 
complete Apr 17 22:21:24 user nova-compute[71972]: INFO nova.compute.manager [None req-74b59b6c-6fd9-4194-bf21-5af22fcdc1d6 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Took 4.66 seconds to snapshot the instance on the hypervisor. Apr 17 22:21:26 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:21:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:21:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:21:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:21:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:21:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:21:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:21:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:21:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:21:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:21:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:21:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:21:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:21:41 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:21:43 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:21:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] 
Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:21:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:21:43 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:21:43 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:21:43 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:21:43 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:21:43 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:21:44 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json" returned: 0 in 0.129s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:21:44 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json {{(pid=71972) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:21:44 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:21:44 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:21:44 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:21:44 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:21:44 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
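The update_available_resource audit above re-runs the same disk probe for every instance: qemu-img info wrapped in the oslo_concurrency.prlimit helper (1 GiB address-space and 30 s CPU caps) with --force-share so an image held open by a running QEMU can still be read. A small sketch of issuing that probe and reading its JSON output; the disk path is a placeholder, and it assumes oslo.concurrency is importable by /usr/bin/python3.10, as it is on this node:

    import json
    import subprocess

    disk = "/var/lib/example/disk"  # placeholder for an instance's disk file

    cmd = [
        "/usr/bin/python3.10", "-m", "oslo_concurrency.prlimit",
        "--as=1073741824", "--cpu=30", "--",
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "info", disk, "--force-share", "--output=json",
    ]
    result = subprocess.run(cmd, check=True, capture_output=True, text=True)
    info = json.loads(result.stdout)
    # qemu-img's JSON report includes "format", "virtual-size" (bytes) and
    # "actual-size" (bytes currently allocated on disk).
    print(info["format"], info["virtual-size"], info["actual-size"])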
Apr 17 22:21:44 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=9032MB free_disk=26.50198745727539GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:21:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:21:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:21:44 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance ac56fdf0-5568-434b-ad61-805634c2beeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:21:44 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 73530c77-9d35-486b-af6b-3773519c4206 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:21:44 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:21:44 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:21:44 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:21:44 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:21:44 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:21:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.214s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:21:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:21:46 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:21:46 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:21:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "refresh_cache-73530c77-9d35-486b-af6b-3773519c4206" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:21:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] 
Acquired lock "refresh_cache-73530c77-9d35-486b-af6b-3773519c4206" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:21:47 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Forcefully refreshing network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 22:21:47 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Updating instance_info_cache with network_info: [{"id": "de79c687-16bd-4f64-a449-70b021e4ea02", "address": "fa:16:3e:62:9f:00", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapde79c687-16", "ovs_interfaceid": "de79c687-16bd-4f64-a449-70b021e4ea02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:21:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Releasing lock "refresh_cache-73530c77-9d35-486b-af6b-3773519c4206" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:21:47 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Updated the network info_cache for instance {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 22:21:47 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:21:47 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:21:47 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:21:47 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:21:48 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:21:49 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:21:49 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:21:50 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:21:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:21:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:21:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:21:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:21:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:21:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:21:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:22:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:22:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:22:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:22:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:22:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:22:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:22:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:22:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:22:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:22:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:22:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:22:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:22:07 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:22:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:22:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:22:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:22:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:22:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:22:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:22:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:22:21 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:22:25 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock 
"1181392c-c6c7-42ae-877a-6e7554531f42" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:22:25 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "1181392c-c6c7-42ae-877a-6e7554531f42" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:22:25 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Starting instance... {{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:22:26 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:22:26 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:22:26 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:22:26 user nova-compute[71972]: INFO nova.compute.claims [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Claim successful on node user Apr 17 22:22:26 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:22:26 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:22:26 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:22:26 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.256s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:22:26 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:22:26 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Allocating IP information in the background. 
{{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:22:26 user nova-compute[71972]: DEBUG nova.network.neutron [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:22:26 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names Apr 17 22:22:26 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Start building block device mappings for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:22:26 user nova-compute[71972]: INFO nova.virt.block_device [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Booting with volume-backed-image 80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f at /dev/vda Apr 17 22:22:26 user nova-compute[71972]: DEBUG nova.policy [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6df5551ab4974747a0412ce089b770b0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '26967ac9e8cb45b6aea04a699d4a1eca', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:22:26 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:22:27 user nova-compute[71972]: DEBUG nova.network.neutron [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Successfully created port: c42c70ae-fbe9-4314-9e95-fae3c9e5600a {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:22:27 user nova-compute[71972]: DEBUG nova.network.neutron [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Successfully updated port: c42c70ae-fbe9-4314-9e95-fae3c9e5600a {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:22:27 user nova-compute[71972]: DEBUG 
oslo_concurrency.lockutils [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "refresh_cache-1181392c-c6c7-42ae-877a-6e7554531f42" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:22:27 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquired lock "refresh_cache-1181392c-c6c7-42ae-877a-6e7554531f42" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:22:27 user nova-compute[71972]: DEBUG nova.network.neutron [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:22:27 user nova-compute[71972]: DEBUG nova.compute.manager [req-64488806-c2ee-48e8-9cc9-b657a066e720 req-eb4f29a6-ccc6-4105-ad33-69f1320a4f68 service nova] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Received event network-changed-c42c70ae-fbe9-4314-9e95-fae3c9e5600a {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:22:27 user nova-compute[71972]: DEBUG nova.compute.manager [req-64488806-c2ee-48e8-9cc9-b657a066e720 req-eb4f29a6-ccc6-4105-ad33-69f1320a4f68 service nova] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Refreshing instance network info cache due to event network-changed-c42c70ae-fbe9-4314-9e95-fae3c9e5600a. {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:22:27 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-64488806-c2ee-48e8-9cc9-b657a066e720 req-eb4f29a6-ccc6-4105-ad33-69f1320a4f68 service nova] Acquiring lock "refresh_cache-1181392c-c6c7-42ae-877a-6e7554531f42" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:22:27 user nova-compute[71972]: DEBUG nova.network.neutron [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Instance cache missing network info. 
{{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:22:28 user nova-compute[71972]: DEBUG nova.network.neutron [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Updating instance_info_cache with network_info: [{"id": "c42c70ae-fbe9-4314-9e95-fae3c9e5600a", "address": "fa:16:3e:4c:46:fe", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc42c70ae-fb", "ovs_interfaceid": "c42c70ae-fbe9-4314-9e95-fae3c9e5600a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:22:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Releasing lock "refresh_cache-1181392c-c6c7-42ae-877a-6e7554531f42" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:22:28 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Instance network_info: |[{"id": "c42c70ae-fbe9-4314-9e95-fae3c9e5600a", "address": "fa:16:3e:4c:46:fe", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc42c70ae-fb", "ovs_interfaceid": "c42c70ae-fbe9-4314-9e95-fae3c9e5600a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:22:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-64488806-c2ee-48e8-9cc9-b657a066e720 req-eb4f29a6-ccc6-4105-ad33-69f1320a4f68 service nova] Acquired lock 
"refresh_cache-1181392c-c6c7-42ae-877a-6e7554531f42" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:22:28 user nova-compute[71972]: DEBUG nova.network.neutron [req-64488806-c2ee-48e8-9cc9-b657a066e720 req-eb4f29a6-ccc6-4105-ad33-69f1320a4f68 service nova] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Refreshing network info cache for port c42c70ae-fbe9-4314-9e95-fae3c9e5600a {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:22:28 user nova-compute[71972]: DEBUG nova.network.neutron [req-64488806-c2ee-48e8-9cc9-b657a066e720 req-eb4f29a6-ccc6-4105-ad33-69f1320a4f68 service nova] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Updated VIF entry in instance network info cache for port c42c70ae-fbe9-4314-9e95-fae3c9e5600a. {{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:22:28 user nova-compute[71972]: DEBUG nova.network.neutron [req-64488806-c2ee-48e8-9cc9-b657a066e720 req-eb4f29a6-ccc6-4105-ad33-69f1320a4f68 service nova] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Updating instance_info_cache with network_info: [{"id": "c42c70ae-fbe9-4314-9e95-fae3c9e5600a", "address": "fa:16:3e:4c:46:fe", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc42c70ae-fb", "ovs_interfaceid": "c42c70ae-fbe9-4314-9e95-fae3c9e5600a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:22:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-64488806-c2ee-48e8-9cc9-b657a066e720 req-eb4f29a6-ccc6-4105-ad33-69f1320a4f68 service nova] Releasing lock "refresh_cache-1181392c-c6c7-42ae-877a-6e7554531f42" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:22:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:22:31 user nova-compute[71972]: WARNING nova.compute.manager [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Volume id: 19ce8ba7-19cd-4b34-a0f9-a8c9510682a5 finished being created but its status is error. 
Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Instance failed block device setup: nova.exception.VolumeNotCreated: Volume 19ce8ba7-19cd-4b34-a0f9-a8c9510682a5 did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Traceback (most recent call last): Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] File "/opt/stack/nova/nova/compute/manager.py", line 2175, in _prep_block_device Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] driver_block_device.attach_block_devices( Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] File "/opt/stack/nova/nova/virt/block_device.py", line 936, in attach_block_devices Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] _log_and_attach(device) Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] File "/opt/stack/nova/nova/virt/block_device.py", line 933, in _log_and_attach Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] bdm.attach(*attach_args, **attach_kwargs) Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] File "/opt/stack/nova/nova/virt/block_device.py", line 831, in attach Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] self.volume_id, self.attachment_id = self._create_volume( Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] File "/opt/stack/nova/nova/virt/block_device.py", line 435, in _create_volume Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] self._call_wait_func(context, wait_func, volume_api, vol['id']) Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] File "/opt/stack/nova/nova/virt/block_device.py", line 785, in _call_wait_func Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] with excutils.save_and_reraise_exception(): Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] self.force_reraise() Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] raise self.value Apr 17 22:22:31 user 
nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] File "/opt/stack/nova/nova/virt/block_device.py", line 783, in _call_wait_func Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] wait_func(context, volume_id) Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] File "/opt/stack/nova/nova/compute/manager.py", line 1792, in _await_block_device_map_created Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] raise exception.VolumeNotCreated(volume_id=vol_id, Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] nova.exception.VolumeNotCreated: Volume 19ce8ba7-19cd-4b34-a0f9-a8c9510682a5 did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. Apr 17 22:22:31 user nova-compute[71972]: ERROR nova.compute.manager [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Apr 17 22:22:31 user nova-compute[71972]: DEBUG nova.compute.claims [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Aborting claim: {{(pid=71972) abort /opt/stack/nova/nova/compute/claims.py:84}} Apr 17 22:22:31 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:22:31 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:22:32 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:22:32 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) 
set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:22:32 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.236s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:22:32 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Build of instance 1181392c-c6c7-42ae-877a-6e7554531f42 aborted: Volume 19ce8ba7-19cd-4b34-a0f9-a8c9510682a5 did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. {{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2636}} Apr 17 22:22:32 user nova-compute[71972]: DEBUG nova.compute.utils [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Build of instance 1181392c-c6c7-42ae-877a-6e7554531f42 aborted: Volume 19ce8ba7-19cd-4b34-a0f9-a8c9510682a5 did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. {{(pid=71972) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} Apr 17 22:22:32 user nova-compute[71972]: ERROR nova.compute.manager [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Build of instance 1181392c-c6c7-42ae-877a-6e7554531f42 aborted: Volume 19ce8ba7-19cd-4b34-a0f9-a8c9510682a5 did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error.: nova.exception.BuildAbortException: Build of instance 1181392c-c6c7-42ae-877a-6e7554531f42 aborted: Volume 19ce8ba7-19cd-4b34-a0f9-a8c9510682a5 did not finish being created even after we waited 5 seconds or 2 attempts. And its status is error. 
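At this point the build is aborted: the VolumeNotCreated error is re-raised as a BuildAbortException and the resource claim on node user is rolled back. The compute log only records that volume 19ce8ba7-19cd-4b34-a0f9-a8c9510682a5 ended up in 'error'; the underlying cause has to come from the Cinder side (cinder-volume / cinder-scheduler logs). One hedged way to start that follow-up is to read the volume record back through openstacksdk, as sketched below; the 'devstack' cloud name is an assumption about the local clouds.yaml, not something taken from this log.

# Hedged sketch: fetch the failed boot volume and print its status.
import openstack

conn = openstack.connect(cloud='devstack')  # assumed clouds.yaml entry
vol = conn.block_storage.get_volume('19ce8ba7-19cd-4b34-a0f9-a8c9510682a5')
print(vol.status)      # expected to read 'error' for this build attempt
print(vol.size, vol.created_at)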
Apr 17 22:22:32 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Unplugging VIFs for instance {{(pid=71972) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2961}} Apr 17 22:22:32 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:22:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-612198680',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-612198680',id=24,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='26967ac9e8cb45b6aea04a699d4a1eca',ramdisk_id='',reservation_id='r-uywt54ye',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-480550513',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member'},tags=TagList,task_state='block_device_mapping',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:22:26Z,user_data=None,user_id='6df5551ab4974747a0412ce089b770b0',uuid=1181392c-c6c7-42ae-877a-6e7554531f42,vcpu_model=None,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c42c70ae-fbe9-4314-9e95-fae3c9e5600a", "address": "fa:16:3e:4c:46:fe", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": 
"tapc42c70ae-fb", "ovs_interfaceid": "c42c70ae-fbe9-4314-9e95-fae3c9e5600a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:22:32 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Converting VIF {"id": "c42c70ae-fbe9-4314-9e95-fae3c9e5600a", "address": "fa:16:3e:4c:46:fe", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapc42c70ae-fb", "ovs_interfaceid": "c42c70ae-fbe9-4314-9e95-fae3c9e5600a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:22:32 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:46:fe,bridge_name='br-int',has_traffic_filtering=True,id=c42c70ae-fbe9-4314-9e95-fae3c9e5600a,network=Network(bf926b5c-baa2-457d-aab9-e2deea0b84c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc42c70ae-fb') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:22:32 user nova-compute[71972]: DEBUG os_vif [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:46:fe,bridge_name='br-int',has_traffic_filtering=True,id=c42c70ae-fbe9-4314-9e95-fae3c9e5600a,network=Network(bf926b5c-baa2-457d-aab9-e2deea0b84c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc42c70ae-fb') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:22:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:22:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc42c70ae-fb, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:22:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change 
{{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:22:32 user nova-compute[71972]: INFO os_vif [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:46:fe,bridge_name='br-int',has_traffic_filtering=True,id=c42c70ae-fbe9-4314-9e95-fae3c9e5600a,network=Network(bf926b5c-baa2-457d-aab9-e2deea0b84c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc42c70ae-fb') Apr 17 22:22:32 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Unplugged VIFs for instance {{(pid=71972) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2997}} Apr 17 22:22:32 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:22:32 user nova-compute[71972]: DEBUG nova.network.neutron [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:22:32 user nova-compute[71972]: DEBUG nova.network.neutron [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:22:32 user nova-compute[71972]: INFO nova.compute.manager [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 1181392c-c6c7-42ae-877a-6e7554531f42] Took 0.59 seconds to deallocate network for instance. 
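The cleanup path mirrors the build path: the DelPortCommand transaction on br-int reports no change, the VIF is unplugged, and the network allocation is torn down in about 0.59 s. All of the "Acquiring lock / Lock ... acquired / Lock ... released" lines in this sequence come from oslo.concurrency's lockutils helpers, keyed either on the instance UUID or on "compute_resources"; the per-instance lock release a few lines further down reports the whole locked build-and-cleanup took about 7.09 s. A minimal, illustrative use of the same decorator is shown below; the function name echoes the one in the log, but the body is a placeholder rather than nova code.

# Illustrative use of oslo.concurrency's synchronized decorator, the
# mechanism behind the lock messages in this log. Placeholder body only.
from oslo_concurrency import lockutils

@lockutils.synchronized('1181392c-c6c7-42ae-877a-6e7554531f42')
def _locked_do_build_and_run_instance():
    # Critical section: only one build/cleanup path per instance at a time.
    pass

if __name__ == '__main__':
    _locked_do_build_and_run_instance()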
Apr 17 22:22:33 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Deleted allocations for instance 1181392c-c6c7-42ae-877a-6e7554531f42 Apr 17 22:22:33 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-b9e7e9cc-c0a0-45f8-9854-d5ab07a4fc33 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "1181392c-c6c7-42ae-877a-6e7554531f42" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.090s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:22:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 3912-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:22:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:22:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:22:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:22:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:22:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:22:41 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:22:44 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:22:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:22:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:22:44 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71972) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:22:44 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:22:44 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:22:44 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:22:44 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:22:45 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json" returned: 0 in 0.137s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:22:45 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:22:45 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:22:45 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:22:45 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:22:45 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:22:45 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:22:45 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=9064MB free_disk=26.50100326538086GB free_vcpus=10 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", 
"product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:22:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:22:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:22:45 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance ac56fdf0-5568-434b-ad61-805634c2beeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:22:45 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 73530c77-9d35-486b-af6b-3773519c4206 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:22:45 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 2 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:22:45 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=768MB phys_disk=40GB used_disk=2GB total_vcpus=12 used_vcpus=2 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:22:45 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:22:45 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:22:45 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:22:45 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.249s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:22:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:22:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:22:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:22:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:22:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:22:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:22:46 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:22:46 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:22:46 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Rebuilding the list of instances to heal {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 22:22:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "refresh_cache-ac56fdf0-5568-434b-ad61-805634c2beeb" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:22:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquired lock "refresh_cache-ac56fdf0-5568-434b-ad61-805634c2beeb" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:22:46 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Forcefully refreshing network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 22:22:46 user nova-compute[71972]: DEBUG nova.objects.instance [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lazy-loading 'info_cache' on Instance uuid ac56fdf0-5568-434b-ad61-805634c2beeb {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:22:47 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Updating instance_info_cache with network_info: [{"id": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "address": "fa:16:3e:5e:2a:3b", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4acd5cd6-47", "ovs_interfaceid": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:22:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Releasing lock 
"refresh_cache-ac56fdf0-5568-434b-ad61-805634c2beeb" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:22:47 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Updated the network info_cache for instance {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 22:22:47 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:22:47 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:22:48 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:22:48 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:22:48 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:22:49 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:22:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:22:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:22:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:22:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:22:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:22:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:22:51 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:22:52 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:22:55 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:22:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:23:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:23:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:23:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:23:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:23:01 user nova-compute[71972]: DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:23:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:23:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:23:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:23:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:23:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:23:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:23:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:23:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:23:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:23:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:23:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:23:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:23:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:23:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:23:16 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "73530c77-9d35-486b-af6b-3773519c4206" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:23:16 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-293546c8-ec1e-442f-9637-113dafbc3968 
tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "73530c77-9d35-486b-af6b-3773519c4206" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:23:16 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "73530c77-9d35-486b-af6b-3773519c4206-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:23:16 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "73530c77-9d35-486b-af6b-3773519c4206-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:23:16 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "73530c77-9d35-486b-af6b-3773519c4206-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:23:16 user nova-compute[71972]: INFO nova.compute.manager [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Terminating instance Apr 17 22:23:16 user nova-compute[71972]: DEBUG nova.compute.manager [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Start destroying the instance on the hypervisor. 
{{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:23:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:23:17 user nova-compute[71972]: DEBUG nova.compute.manager [req-8b772d78-fc24-4dd9-a5f0-3308f32f2a39 req-9129321c-95f5-4b50-bad1-d9492f8f8fb1 service nova] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Received event network-vif-unplugged-de79c687-16bd-4f64-a449-70b021e4ea02 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:23:17 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-8b772d78-fc24-4dd9-a5f0-3308f32f2a39 req-9129321c-95f5-4b50-bad1-d9492f8f8fb1 service nova] Acquiring lock "73530c77-9d35-486b-af6b-3773519c4206-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:23:17 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-8b772d78-fc24-4dd9-a5f0-3308f32f2a39 req-9129321c-95f5-4b50-bad1-d9492f8f8fb1 service nova] Lock "73530c77-9d35-486b-af6b-3773519c4206-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:23:17 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-8b772d78-fc24-4dd9-a5f0-3308f32f2a39 req-9129321c-95f5-4b50-bad1-d9492f8f8fb1 service nova] Lock "73530c77-9d35-486b-af6b-3773519c4206-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:23:17 user nova-compute[71972]: DEBUG nova.compute.manager [req-8b772d78-fc24-4dd9-a5f0-3308f32f2a39 req-9129321c-95f5-4b50-bad1-d9492f8f8fb1 service nova] [instance: 73530c77-9d35-486b-af6b-3773519c4206] No waiting events found dispatching network-vif-unplugged-de79c687-16bd-4f64-a449-70b021e4ea02 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:23:17 user nova-compute[71972]: DEBUG nova.compute.manager [req-8b772d78-fc24-4dd9-a5f0-3308f32f2a39 req-9129321c-95f5-4b50-bad1-d9492f8f8fb1 service nova] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Received event network-vif-unplugged-de79c687-16bd-4f64-a449-70b021e4ea02 for instance with task_state deleting. {{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:23:17 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:23:17 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:23:17 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Instance destroyed successfully. 
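[Annotation] The terminate sequence above is serialized per instance with oslo.concurrency's named in-process locks, which is what produces the paired "Acquiring lock ... by ...", "Lock ... acquired ... :: waited" and "released ... :: held" messages. A minimal sketch of that locking pattern, assuming only the oslo.concurrency library (the lock name and function below are illustrative, not Nova code):

    from oslo_concurrency import lockutils

    # Named in-process lock: concurrent callers serialize here, and the time
    # spent blocking is what the ":: waited N s" figure in the log measures.
    @lockutils.synchronized('73530c77-9d35-486b-af6b-3773519c4206')
    def do_terminate_instance():
        # destroy the guest, unplug VIFs, delete files, free allocations
        pass

    do_terminate_instance()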
Apr 17 22:23:17 user nova-compute[71972]: DEBUG nova.objects.instance [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lazy-loading 'resources' on Instance uuid 73530c77-9d35-486b-af6b-3773519c4206 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:23:17 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:19:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1002520753',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1002520753',id=23,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-17T22:19:34Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='26967ac9e8cb45b6aea04a699d4a1eca',ramdisk_id='',reservation_id='r-zpu4rruz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-480550513',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:21:24Z,user_data=None,user_id='6df5551ab4974747a0412ce089b770b0',uuid=73530c77-9d35-486b-af6b-3773519c4206,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "de79c687-16bd-4f64-a449-70b021e4ea02", "address": "fa:16:3e:62:9f:00", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": 
{"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapde79c687-16", "ovs_interfaceid": "de79c687-16bd-4f64-a449-70b021e4ea02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:23:17 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Converting VIF {"id": "de79c687-16bd-4f64-a449-70b021e4ea02", "address": "fa:16:3e:62:9f:00", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapde79c687-16", "ovs_interfaceid": "de79c687-16bd-4f64-a449-70b021e4ea02", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:23:17 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:62:9f:00,bridge_name='br-int',has_traffic_filtering=True,id=de79c687-16bd-4f64-a449-70b021e4ea02,network=Network(bf926b5c-baa2-457d-aab9-e2deea0b84c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde79c687-16') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:23:17 user nova-compute[71972]: DEBUG os_vif [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:9f:00,bridge_name='br-int',has_traffic_filtering=True,id=de79c687-16bd-4f64-a449-70b021e4ea02,network=Network(bf926b5c-baa2-457d-aab9-e2deea0b84c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde79c687-16') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:23:17 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:23:17 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapde79c687-16, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:23:17 user 
nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:23:17 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:23:17 user nova-compute[71972]: INFO os_vif [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:9f:00,bridge_name='br-int',has_traffic_filtering=True,id=de79c687-16bd-4f64-a449-70b021e4ea02,network=Network(bf926b5c-baa2-457d-aab9-e2deea0b84c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapde79c687-16') Apr 17 22:23:17 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Deleting instance files /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206_del Apr 17 22:23:17 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Deletion of /opt/stack/data/nova/instances/73530c77-9d35-486b-af6b-3773519c4206_del complete Apr 17 22:23:17 user nova-compute[71972]: INFO nova.compute.manager [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Took 0.66 seconds to destroy the instance on the hypervisor. Apr 17 22:23:17 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:23:17 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:23:17 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 73530c77-9d35-486b-af6b-3773519c4206] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:23:17 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:23:17 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Took 0.52 seconds to deallocate network for instance. 
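[Annotation] The VIF unplug above reduces to the single OVSDB transaction logged as DelPortCommand(port=tapde79c687-16, bridge=br-int, if_exists=True). A rough sketch of issuing the equivalent call through ovsdbapp's documented idl API; the endpoint is the tcp:127.0.0.1:6640 address visible in the log, and the timeout value is an assumption:

    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    # Connect to the local ovsdb-server and delete the tap port from br-int,
    # tolerating the port already being gone (if_exists=True).
    idl = connection.OvsdbIdl.from_server('tcp:127.0.0.1:6640', 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=5))
    api.del_port('tapde79c687-16', bridge='br-int', if_exists=True).execute(
        check_error=True)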
Apr 17 22:23:18 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:23:18 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:23:18 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:23:18 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:23:18 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.137s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:23:18 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Deleted allocations for instance 73530c77-9d35-486b-af6b-3773519c4206 Apr 17 22:23:18 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-293546c8-ec1e-442f-9637-113dafbc3968 tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "73530c77-9d35-486b-af6b-3773519c4206" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.500s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:23:19 user nova-compute[71972]: DEBUG nova.compute.manager [req-26c77329-f437-47d7-9638-04977b1034a1 req-15023c2a-1767-4be9-8fb1-69e9466bc8cd service 
nova] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Received event network-vif-plugged-de79c687-16bd-4f64-a449-70b021e4ea02 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:23:19 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-26c77329-f437-47d7-9638-04977b1034a1 req-15023c2a-1767-4be9-8fb1-69e9466bc8cd service nova] Acquiring lock "73530c77-9d35-486b-af6b-3773519c4206-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:23:19 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-26c77329-f437-47d7-9638-04977b1034a1 req-15023c2a-1767-4be9-8fb1-69e9466bc8cd service nova] Lock "73530c77-9d35-486b-af6b-3773519c4206-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:23:19 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-26c77329-f437-47d7-9638-04977b1034a1 req-15023c2a-1767-4be9-8fb1-69e9466bc8cd service nova] Lock "73530c77-9d35-486b-af6b-3773519c4206-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:23:19 user nova-compute[71972]: DEBUG nova.compute.manager [req-26c77329-f437-47d7-9638-04977b1034a1 req-15023c2a-1767-4be9-8fb1-69e9466bc8cd service nova] [instance: 73530c77-9d35-486b-af6b-3773519c4206] No waiting events found dispatching network-vif-plugged-de79c687-16bd-4f64-a449-70b021e4ea02 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:23:19 user nova-compute[71972]: WARNING nova.compute.manager [req-26c77329-f437-47d7-9638-04977b1034a1 req-15023c2a-1767-4be9-8fb1-69e9466bc8cd service nova] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Received unexpected event network-vif-plugged-de79c687-16bd-4f64-a449-70b021e4ea02 for instance with vm_state deleted and task_state None. 
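[Annotation] The "No waiting events found dispatching ..." and "Received unexpected event ..." messages above come from Nova popping a per-instance table of event waiters whenever Neutron delivers an external event; after deletion there is nothing left to wake, so the plugged event is merely warned about. Purely as an illustration of that pattern (this is not Nova's implementation; all names below are made up), the idea reduces to a lock-protected dict of waiters:

    import threading

    _waiters = {}                     # (instance_uuid, event_name) -> threading.Event
    _waiters_lock = threading.Lock()

    def prepare_for_event(instance_uuid, event_name):
        # A task expecting an event (e.g. network-vif-unplugged) registers first.
        waiter = threading.Event()
        with _waiters_lock:
            _waiters[(instance_uuid, event_name)] = waiter
        return waiter

    def pop_instance_event(instance_uuid, event_name):
        # Called when the external event arrives; wake the waiter if one exists.
        with _waiters_lock:
            waiter = _waiters.pop((instance_uuid, event_name), None)
        if waiter is None:
            print('No waiting events found dispatching %s' % event_name)
        else:
            waiter.set()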
Apr 17 22:23:19 user nova-compute[71972]: DEBUG nova.compute.manager [req-26c77329-f437-47d7-9638-04977b1034a1 req-15023c2a-1767-4be9-8fb1-69e9466bc8cd service nova] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Received event network-vif-deleted-de79c687-16bd-4f64-a449-70b021e4ea02 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:23:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:23:27 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:23:32 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:23:32 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 73530c77-9d35-486b-af6b-3773519c4206] VM Stopped (Lifecycle Event) Apr 17 22:23:32 user nova-compute[71972]: DEBUG nova.compute.manager [None req-b9c7eca7-b174-46a2-bfce-8e787932a2be None None] [instance: 73530c77-9d35-486b-af6b-3773519c4206] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:23:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:23:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:23:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:23:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:23:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:23:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:23:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:23:42 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:23:46 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:23:46 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:23:46 user nova-compute[71972]: DEBUG nova.compute.manager [None 
req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Didn't find any instances for network info cache update. {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 17 22:23:46 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:23:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:23:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:23:46 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:23:46 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:23:46 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:23:46 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:23:46 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:23:47 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info 
/opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb/disk --force-share --output=json" returned: 0 in 0.145s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:23:47 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:23:47 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:23:47 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:23:47 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=9114MB free_disk=26.555728912353516GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", 
"product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:23:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:23:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:23:47 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance ac56fdf0-5568-434b-ad61-805634c2beeb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:23:47 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:23:47 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:23:47 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:23:47 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:23:47 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:23:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.232s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:23:48 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:23:48 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:23:48 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:23:50 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:23:50 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:23:51 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:23:51 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:23:52 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:23:52 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:23:57 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:07 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:07 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "ac56fdf0-5568-434b-ad61-805634c2beeb" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:24:07 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:24:07 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:24:07 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None 
req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:24:07 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:24:07 user nova-compute[71972]: INFO nova.compute.manager [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Terminating instance Apr 17 22:24:07 user nova-compute[71972]: DEBUG nova.compute.manager [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Start destroying the instance on the hypervisor. {{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:24:07 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:07 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:07 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:07 user nova-compute[71972]: DEBUG nova.compute.manager [req-abc47408-f8ca-4422-96e8-039035f2041f req-645fa4a3-ce3d-473f-86f2-93827c10b7d9 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Received event network-vif-unplugged-4acd5cd6-47e1-4450-8baa-092f9444c03f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:24:07 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-abc47408-f8ca-4422-96e8-039035f2041f req-645fa4a3-ce3d-473f-86f2-93827c10b7d9 service nova] Acquiring lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:24:07 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-abc47408-f8ca-4422-96e8-039035f2041f req-645fa4a3-ce3d-473f-86f2-93827c10b7d9 service nova] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:24:07 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-abc47408-f8ca-4422-96e8-039035f2041f 
req-645fa4a3-ce3d-473f-86f2-93827c10b7d9 service nova] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:24:07 user nova-compute[71972]: DEBUG nova.compute.manager [req-abc47408-f8ca-4422-96e8-039035f2041f req-645fa4a3-ce3d-473f-86f2-93827c10b7d9 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] No waiting events found dispatching network-vif-unplugged-4acd5cd6-47e1-4450-8baa-092f9444c03f {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:24:07 user nova-compute[71972]: DEBUG nova.compute.manager [req-abc47408-f8ca-4422-96e8-039035f2041f req-645fa4a3-ce3d-473f-86f2-93827c10b7d9 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Received event network-vif-unplugged-4acd5cd6-47e1-4450-8baa-092f9444c03f for instance with task_state deleting. {{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:24:08 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:08 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:08 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:08 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:08 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:08 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Instance destroyed successfully. 
Apr 17 22:24:08 user nova-compute[71972]: DEBUG nova.objects.instance [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lazy-loading 'resources' on Instance uuid ac56fdf0-5568-434b-ad61-805634c2beeb {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:24:08 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:15:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-ServerBootFromVolumeStableRescueTest-server-1160681713',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverbootfromvolumestablerescuetest-server-1160681713',id=19,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2023-04-17T22:15:44Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='26967ac9e8cb45b6aea04a699d4a1eca',ramdisk_id='',reservation_id='r-lp93sb9o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerBootFromVolumeStableRescueTest-480550513',owner_user_name='tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:17:34Z,user_data=None,user_id='6df5551ab4974747a0412ce089b770b0',uuid=ac56fdf0-5568-434b-ad61-805634c2beeb,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "address": "fa:16:3e:5e:2a:3b", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": 
{"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4acd5cd6-47", "ovs_interfaceid": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:24:08 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Converting VIF {"id": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "address": "fa:16:3e:5e:2a:3b", "network": {"id": "bf926b5c-baa2-457d-aab9-e2deea0b84c9", "bridge": "br-int", "label": "tempest-ServerBootFromVolumeStableRescueTest-299625319-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "26967ac9e8cb45b6aea04a699d4a1eca", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tap4acd5cd6-47", "ovs_interfaceid": "4acd5cd6-47e1-4450-8baa-092f9444c03f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:24:08 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5e:2a:3b,bridge_name='br-int',has_traffic_filtering=True,id=4acd5cd6-47e1-4450-8baa-092f9444c03f,network=Network(bf926b5c-baa2-457d-aab9-e2deea0b84c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4acd5cd6-47') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:24:08 user nova-compute[71972]: DEBUG os_vif [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5e:2a:3b,bridge_name='br-int',has_traffic_filtering=True,id=4acd5cd6-47e1-4450-8baa-092f9444c03f,network=Network(bf926b5c-baa2-457d-aab9-e2deea0b84c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4acd5cd6-47') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:24:08 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:08 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4acd5cd6-47, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:24:08 user 
nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:08 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:24:08 user nova-compute[71972]: INFO os_vif [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5e:2a:3b,bridge_name='br-int',has_traffic_filtering=True,id=4acd5cd6-47e1-4450-8baa-092f9444c03f,network=Network(bf926b5c-baa2-457d-aab9-e2deea0b84c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4acd5cd6-47') Apr 17 22:24:08 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Deleting instance files /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb_del Apr 17 22:24:08 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Deletion of /opt/stack/data/nova/instances/ac56fdf0-5568-434b-ad61-805634c2beeb_del complete Apr 17 22:24:08 user nova-compute[71972]: INFO nova.compute.manager [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Took 0.86 seconds to destroy the instance on the hypervisor. Apr 17 22:24:08 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:24:08 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:24:08 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:24:08 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:08 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:08 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:08 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:08 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:08 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:08 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:24:09 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Took 0.63 seconds to deallocate network for instance. 
Apr 17 22:24:09 user nova-compute[71972]: DEBUG nova.compute.manager [req-fe451924-a2ea-4133-8626-8a42bd2ec8a8 req-b9a18941-0692-4e38-81d7-854338f6ae11 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Received event network-vif-deleted-4acd5cd6-47e1-4450-8baa-092f9444c03f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:24:09 user nova-compute[71972]: INFO nova.compute.manager [req-fe451924-a2ea-4133-8626-8a42bd2ec8a8 req-b9a18941-0692-4e38-81d7-854338f6ae11 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Neutron deleted interface 4acd5cd6-47e1-4450-8baa-092f9444c03f; detaching it from the instance and deleting it from the info cache Apr 17 22:24:09 user nova-compute[71972]: DEBUG nova.network.neutron [req-fe451924-a2ea-4133-8626-8a42bd2ec8a8 req-b9a18941-0692-4e38-81d7-854338f6ae11 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG nova.compute.manager [req-fe451924-a2ea-4133-8626-8a42bd2ec8a8 req-b9a18941-0692-4e38-81d7-854338f6ae11 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Detach interface failed, port_id=4acd5cd6-47e1-4450-8baa-092f9444c03f, reason: Instance ac56fdf0-5568-434b-ad61-805634c2beeb could not be found. {{(pid=71972) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10816}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 
1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.104s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:24:09 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Deleted allocations for instance ac56fdf0-5568-434b-ad61-805634c2beeb Apr 17 22:24:09 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-78f24c86-99ca-4f42-8de2-a3dab641e3da tempest-ServerBootFromVolumeStableRescueTest-480550513 tempest-ServerBootFromVolumeStableRescueTest-480550513-project-member] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 1.768s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG nova.compute.manager [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Received event network-vif-plugged-4acd5cd6-47e1-4450-8baa-092f9444c03f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] Acquiring lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG nova.compute.manager [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] No waiting events found dispatching network-vif-plugged-4acd5cd6-47e1-4450-8baa-092f9444c03f {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:24:09 user nova-compute[71972]: WARNING nova.compute.manager [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] [instance: 
ac56fdf0-5568-434b-ad61-805634c2beeb] Received unexpected event network-vif-plugged-4acd5cd6-47e1-4450-8baa-092f9444c03f for instance with vm_state deleted and task_state None. Apr 17 22:24:09 user nova-compute[71972]: DEBUG nova.compute.manager [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Received event network-vif-plugged-4acd5cd6-47e1-4450-8baa-092f9444c03f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] Acquiring lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG nova.compute.manager [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] No waiting events found dispatching network-vif-plugged-4acd5cd6-47e1-4450-8baa-092f9444c03f {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:24:09 user nova-compute[71972]: WARNING nova.compute.manager [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Received unexpected event network-vif-plugged-4acd5cd6-47e1-4450-8baa-092f9444c03f for instance with vm_state deleted and task_state None. 
Apr 17 22:24:09 user nova-compute[71972]: DEBUG nova.compute.manager [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Received event network-vif-plugged-4acd5cd6-47e1-4450-8baa-092f9444c03f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] Acquiring lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG nova.compute.manager [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] No waiting events found dispatching network-vif-plugged-4acd5cd6-47e1-4450-8baa-092f9444c03f {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:24:09 user nova-compute[71972]: WARNING nova.compute.manager [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Received unexpected event network-vif-plugged-4acd5cd6-47e1-4450-8baa-092f9444c03f for instance with vm_state deleted and task_state None. 
Apr 17 22:24:09 user nova-compute[71972]: DEBUG nova.compute.manager [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Received event network-vif-unplugged-4acd5cd6-47e1-4450-8baa-092f9444c03f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] Acquiring lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG nova.compute.manager [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] No waiting events found dispatching network-vif-unplugged-4acd5cd6-47e1-4450-8baa-092f9444c03f {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:24:09 user nova-compute[71972]: WARNING nova.compute.manager [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Received unexpected event network-vif-unplugged-4acd5cd6-47e1-4450-8baa-092f9444c03f for instance with vm_state deleted and task_state None. 
Apr 17 22:24:09 user nova-compute[71972]: DEBUG nova.compute.manager [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Received event network-vif-plugged-4acd5cd6-47e1-4450-8baa-092f9444c03f {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] Acquiring lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] Lock "ac56fdf0-5568-434b-ad61-805634c2beeb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:24:09 user nova-compute[71972]: DEBUG nova.compute.manager [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] No waiting events found dispatching network-vif-plugged-4acd5cd6-47e1-4450-8baa-092f9444c03f {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:24:09 user nova-compute[71972]: WARNING nova.compute.manager [req-50daeeff-0283-402e-bdc4-14995fa8d6ec req-73ac42d8-1186-49ac-b791-2d4c8fd2e625 service nova] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Received unexpected event network-vif-plugged-4acd5cd6-47e1-4450-8baa-092f9444c03f for instance with vm_state deleted and task_state None. 
Apr 17 22:24:13 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:24:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:24:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:24:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:24:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:24:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:23 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:24:23 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] VM Stopped (Lifecycle Event) Apr 17 22:24:23 user nova-compute[71972]: DEBUG nova.compute.manager [None req-e78e0f07-44cf-4f39-a73f-ef09219a0b0d None None] [instance: ac56fdf0-5568-434b-ad61-805634c2beeb] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:24:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:24:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:24:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:24:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:24:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:24:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:38 user 
nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:43 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:24:43 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:43 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:24:43 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:24:43 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:24:43 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:24:47 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:24:47 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:24:47 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Rebuilding the list of instances to heal {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 22:24:47 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Didn't find any instances for network info cache update. 
{{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 17 22:24:47 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:24:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:24:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:24:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:24:47 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:24:48 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:24:48 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 22:24:48 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=9200MB free_disk=26.610572814941406GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:24:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:24:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:24:48 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:24:48 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:24:48 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:24:48 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:24:48 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:24:48 user nova-compute[71972]: DEBUG 
oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.155s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:24:48 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:24:48 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Cleaning up deleted instances with incomplete migration {{(pid=71972) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 17 22:24:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:24:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:24:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:24:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:24:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:24:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:24:49 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:24:49 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:24:49 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:24:50 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:24:51 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:24:52 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:24:52 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:24:52 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:24:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:24:54 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:24:56 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:24:58 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:24:59 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:00 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:03 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:05 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:07 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:07 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:25:07 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Cleaning up deleted instances {{(pid=71972) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}} Apr 17 22:25:07 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] There are 0 instances to 
clean {{(pid=71972) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}} Apr 17 22:25:07 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:08 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:10 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Acquiring lock "5eef06c2-982d-484a-9606-e8c5a2971850" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:25:10 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Lock "5eef06c2-982d-484a-9606-e8c5a2971850" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:25:10 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Starting instance... {{(pid=71972) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2404}} Apr 17 22:25:10 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:25:10 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:25:10 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Require both a host and instance NUMA topology to fit instance on host. 
{{(pid=71972) numa_fit_instance_to_host /opt/stack/nova/nova/virt/hardware.py:2338}} Apr 17 22:25:10 user nova-compute[71972]: INFO nova.compute.claims [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Claim successful on node user Apr 17 22:25:10 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:25:10 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:25:10 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.214s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:25:10 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Start building networks asynchronously for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2784}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Allocating IP information in the background. {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1957}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG nova.network.neutron [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] allocate_for_instance() {{(pid=71972) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1154}} Apr 17 22:25:11 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Ignoring supplied device name: /dev/vda. 
Libvirt can't honour user-supplied dev names Apr 17 22:25:11 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Start building block device mappings for instance. {{(pid=71972) _build_resources /opt/stack/nova/nova/compute/manager.py:2819}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Start spawning the instance on the hypervisor. {{(pid=71972) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2604}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Creating instance directory {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4698}} Apr 17 22:25:11 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Creating image(s) Apr 17 22:25:11 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Acquiring lock "/opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Lock "/opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Lock "/opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format..write_to_disk_info_file" :: held 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG nova.policy [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0a3f4e1adcbf46308beb0bd8e35a6dcd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 
'project_id': 'c15f6fa11d3549ceb94c8abb06c78f7e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=71972) authorize /opt/stack/nova/nova/policy.py:203}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.138s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Acquiring lock "fde46b8e739fd6213d1525690f3ccf27384ee720" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: waited 0.002s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.145s {{(pid=71972) execute 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk 1073741824" returned: 0 in 0.047s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Lock "fde46b8e739fd6213d1525690f3ccf27384ee720" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image..create_qcow2_image" :: held 0.199s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.135s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Checking if we can resize image /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk. 
size=1073741824 {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:166}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG nova.virt.disk.api [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Cannot resize image /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk to a smaller size. {{(pid=71972) can_resize_image /opt/stack/nova/nova/virt/disk/api.py:172}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG nova.objects.instance [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Lazy-loading 'migration_context' on Instance uuid 5eef06c2-982d-484a-9606-e8c5a2971850 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG nova.network.neutron [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Successfully created port: f9e56ee9-5e39-4871-a635-202dfa5ab8d1 {{(pid=71972) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:546}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Created local disks {{(pid=71972) _create_image /opt/stack/nova/nova/virt/libvirt/driver.py:4832}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Ensure instance console log exists: /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/console.log {{(pid=71972) _ensure_console_log_for_instance /opt/stack/nova/nova/virt/libvirt/driver.py:4584}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 
tempest-ServerStableDeviceRescueTest-1898719742-project-member] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:25:11 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:25:12 user nova-compute[71972]: DEBUG nova.network.neutron [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Successfully updated port: f9e56ee9-5e39-4871-a635-202dfa5ab8d1 {{(pid=71972) _update_port /opt/stack/nova/nova/network/neutron.py:584}} Apr 17 22:25:12 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Acquiring lock "refresh_cache-5eef06c2-982d-484a-9606-e8c5a2971850" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:25:12 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Acquired lock "refresh_cache-5eef06c2-982d-484a-9606-e8c5a2971850" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:25:12 user nova-compute[71972]: DEBUG nova.network.neutron [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Building network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2000}} Apr 17 22:25:12 user nova-compute[71972]: DEBUG nova.network.neutron [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Instance cache missing network info. 
{{(pid=71972) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3313}} Apr 17 22:25:12 user nova-compute[71972]: DEBUG nova.compute.manager [req-8a69d75b-2061-4da9-aaee-6a1a0c59c584 req-bcbd3964-095f-44b3-8deb-a80f5dd5a6c3 service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Received event network-changed-f9e56ee9-5e39-4871-a635-202dfa5ab8d1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:25:12 user nova-compute[71972]: DEBUG nova.compute.manager [req-8a69d75b-2061-4da9-aaee-6a1a0c59c584 req-bcbd3964-095f-44b3-8deb-a80f5dd5a6c3 service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Refreshing instance network info cache due to event network-changed-f9e56ee9-5e39-4871-a635-202dfa5ab8d1. {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:25:12 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-8a69d75b-2061-4da9-aaee-6a1a0c59c584 req-bcbd3964-095f-44b3-8deb-a80f5dd5a6c3 service nova] Acquiring lock "refresh_cache-5eef06c2-982d-484a-9606-e8c5a2971850" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:25:12 user nova-compute[71972]: DEBUG nova.network.neutron [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Updating instance_info_cache with network_info: [{"id": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "address": "fa:16:3e:7f:4e:23", "network": {"id": "38a74fb3-65db-4af3-b3a0-81fb7ce3ba18", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-35556120-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c15f6fa11d3549ceb94c8abb06c78f7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9e56ee9-5e", "ovs_interfaceid": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:25:12 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:12 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Releasing lock "refresh_cache-5eef06c2-982d-484a-9606-e8c5a2971850" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:25:12 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Instance network_info: |[{"id": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", 
"address": "fa:16:3e:7f:4e:23", "network": {"id": "38a74fb3-65db-4af3-b3a0-81fb7ce3ba18", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-35556120-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c15f6fa11d3549ceb94c8abb06c78f7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9e56ee9-5e", "ovs_interfaceid": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=71972) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1972}} Apr 17 22:25:12 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-8a69d75b-2061-4da9-aaee-6a1a0c59c584 req-bcbd3964-095f-44b3-8deb-a80f5dd5a6c3 service nova] Acquired lock "refresh_cache-5eef06c2-982d-484a-9606-e8c5a2971850" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:25:12 user nova-compute[71972]: DEBUG nova.network.neutron [req-8a69d75b-2061-4da9-aaee-6a1a0c59c584 req-bcbd3964-095f-44b3-8deb-a80f5dd5a6c3 service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Refreshing network info cache for port f9e56ee9-5e39-4871-a635-202dfa5ab8d1 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:25:12 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Start _get_guest_xml network_info=[{"id": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "address": "fa:16:3e:7f:4e:23", "network": {"id": "38a74fb3-65db-4af3-b3a0-81fb7ce3ba18", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-35556120-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c15f6fa11d3549ceb94c8abb06c78f7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9e56ee9-5e", "ovs_interfaceid": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'ide', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}}} 
image_meta=ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_format': None, 'device_type': 'disk', 'encryption_options': None, 'size': 0, 'image_id': '80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7526}} Apr 17 22:25:12 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:25:12 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:25:12 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' {{(pid=71972) _get_guest_cpu_model_config /opt/stack/nova/nova/virt/libvirt/driver.py:5371}} Apr 17 22:25:12 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Getting desirable topologies for flavor Flavor(created_at=2023-04-17T22:06:03Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='b874c39491a2377b8490f5f1e89761a4',container_format='bare',created_at=2023-04-17T22:04:59Z,direct_url=,disk_format='qcow2',id=80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f,min_disk=0,min_ram=0,name='cirros-0.5.2-x86_64-disk',owner='b754a14641ba4010b66bb8b470a31290',properties=ImageMetaProps,protected=,size=16300544,status='active',tags=,updated_at=2023-04-17T22:05:01Z,virtual_size=,visibility=), allow threads: True {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:558}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Flavor limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:343}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c 
tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Image limits 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:347}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Flavor pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:383}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Image pref 0:0:0 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:387}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=71972) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:425}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:564}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:466}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Got 1 possible topologies {{(pid=71972) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:496}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:570}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.virt.hardware [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=71972) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:572}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:25:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-732289986',display_name='tempest-ServerStableDeviceRescueTest-server-732289986',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-732289986',id=25,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ77DUwS1YdOE+AgOW8F2OmFPS7mgWiVKzl7CpNcpYyYaE0PBWLchyK90gX9HM6IuyWgqbEzV4S7I/iswh9WIKISaxrVni8gvtqHbgvv2mKDgVJIiC0uNr4C//sH7z9EdA==',key_name='tempest-keypair-1108392394',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c15f6fa11d3549ceb94c8abb06c78f7e',ramdisk_id='',reservation_id='r-dhdwdd7p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1898719742',owner_user_name='tempest-ServerStableDeviceRescueTest-1898719742-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:25:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0a3f4e1adcbf46308beb0bd8e35a6dcd',uuid=5eef06c2-982d-484a-9606-e8c5a2971850,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "address": "fa:16:3e:7f:4e:23", "network": {"id": "38a74fb3-65db-4af3-b3a0-81fb7ce3ba18", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-35556120-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c15f6fa11d3549ceb94c8abb06c78f7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9e56ee9-5e", "ovs_interfaceid": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm {{(pid=71972) get_config 
/opt/stack/nova/nova/virt/libvirt/vif.py:563}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Converting VIF {"id": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "address": "fa:16:3e:7f:4e:23", "network": {"id": "38a74fb3-65db-4af3-b3a0-81fb7ce3ba18", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-35556120-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c15f6fa11d3549ceb94c8abb06c78f7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9e56ee9-5e", "ovs_interfaceid": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}}
Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:4e:23,bridge_name='br-int',has_traffic_filtering=True,id=f9e56ee9-5e39-4871-a635-202dfa5ab8d1,network=Network(38a74fb3-65db-4af3-b3a0-81fb7ce3ba18),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9e56ee9-5e') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}}
Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.objects.instance [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Lazy-loading 'pci_devices' on Instance uuid 5eef06c2-982d-484a-9606-e8c5a2971850 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}}
Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] End _get_guest_xml xml= [guest domain XML omitted: the XML markup was stripped when this log was extracted, leaving only bare element values interleaved with repeated timestamps; the recoverable details are instance uuid 5eef06c2-982d-484a-9606-e8c5a2971850, libvirt domain name instance-00000019, display name tempest-ServerStableDeviceRescueTest-server-732289986, 131072 KiB of memory, 1 vCPU, an hvm guest with sysinfo identifying OpenStack Foundation / OpenStack Nova 0.0.0, CPU model Nehalem, and an RNG device backed by /dev/urandom] {{(pid=71972) _get_guest_xml /opt/stack/nova/nova/virt/libvirt/driver.py:7532}}
Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:25:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-732289986',display_name='tempest-ServerStableDeviceRescueTest-server-732289986',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-732289986',id=25,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ77DUwS1YdOE+AgOW8F2OmFPS7mgWiVKzl7CpNcpYyYaE0PBWLchyK90gX9HM6IuyWgqbEzV4S7I/iswh9WIKISaxrVni8gvtqHbgvv2mKDgVJIiC0uNr4C//sH7z9EdA==',key_name='tempest-keypair-1108392394',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c15f6fa11d3549ceb94c8abb06c78f7e',ramdisk_id='',reservation_id='r-dhdwdd7p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='pc',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1898719742',owner_user_name='tempest-ServerStableDeviceRescueTest-1898719742-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2023-04-17T22:25:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0a3f4e1adcbf46308beb0bd8e35a6dcd',uuid=5eef06c2-982d-484a-9606-e8c5a2971850,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "address": "fa:16:3e:7f:4e:23", "network": {"id": "38a74fb3-65db-4af3-b3a0-81fb7ce3ba18", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-35556120-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c15f6fa11d3549ceb94c8abb06c78f7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9e56ee9-5e", "ovs_interfaceid": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) plug 
/opt/stack/nova/nova/virt/libvirt/vif.py:710}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Converting VIF {"id": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "address": "fa:16:3e:7f:4e:23", "network": {"id": "38a74fb3-65db-4af3-b3a0-81fb7ce3ba18", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-35556120-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c15f6fa11d3549ceb94c8abb06c78f7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9e56ee9-5e", "ovs_interfaceid": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:4e:23,bridge_name='br-int',has_traffic_filtering=True,id=f9e56ee9-5e39-4871-a635-202dfa5ab8d1,network=Network(38a74fb3-65db-4af3-b3a0-81fb7ce3ba18),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9e56ee9-5e') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG os_vif [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:4e:23,bridge_name='br-int',has_traffic_filtering=True,id=f9e56ee9-5e39-4871-a635-202dfa5ab8d1,network=Network(38a74fb3-65db-4af3-b3a0-81fb7ce3ba18),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9e56ee9-5e') {{(pid=71972) plug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:76}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:129}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:13 user 
nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9e56ee9-5e, may_exist=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf9e56ee9-5e, col_values=(('external_ids', {'iface-id': 'f9e56ee9-5e39-4871-a635-202dfa5ab8d1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7f:4e:23', 'vm-uuid': '5eef06c2-982d-484a-9606-e8c5a2971850'}),)) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:13 user nova-compute[71972]: INFO os_vif [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:4e:23,bridge_name='br-int',has_traffic_filtering=True,id=f9e56ee9-5e39-4871-a635-202dfa5ab8d1,network=Network(38a74fb3-65db-4af3-b3a0-81fb7ce3ba18),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9e56ee9-5e') Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] No BDM found with device name vda, not building metadata. {{(pid=71972) _build_disk_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12065}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] No VIF found with MAC fa:16:3e:7f:4e:23, not building metadata {{(pid=71972) _build_interface_metadata /opt/stack/nova/nova/virt/libvirt/driver.py:12041}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.network.neutron [req-8a69d75b-2061-4da9-aaee-6a1a0c59c584 req-bcbd3964-095f-44b3-8deb-a80f5dd5a6c3 service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Updated VIF entry in instance network info cache for port f9e56ee9-5e39-4871-a635-202dfa5ab8d1. 
{{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG nova.network.neutron [req-8a69d75b-2061-4da9-aaee-6a1a0c59c584 req-bcbd3964-095f-44b3-8deb-a80f5dd5a6c3 service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Updating instance_info_cache with network_info: [{"id": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "address": "fa:16:3e:7f:4e:23", "network": {"id": "38a74fb3-65db-4af3-b3a0-81fb7ce3ba18", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-35556120-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c15f6fa11d3549ceb94c8abb06c78f7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9e56ee9-5e", "ovs_interfaceid": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:25:13 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-8a69d75b-2061-4da9-aaee-6a1a0c59c584 req-bcbd3964-095f-44b3-8deb-a80f5dd5a6c3 service nova] Releasing lock "refresh_cache-5eef06c2-982d-484a-9606-e8c5a2971850" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:25:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:14 user nova-compute[71972]: DEBUG nova.compute.manager [req-d9a183d6-3e2b-486d-ba17-4490c1bb1491 req-dc1d79b8-99fc-44b8-958e-35aba8877a90 service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Received event network-vif-plugged-f9e56ee9-5e39-4871-a635-202dfa5ab8d1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:25:14 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-d9a183d6-3e2b-486d-ba17-4490c1bb1491 req-dc1d79b8-99fc-44b8-958e-35aba8877a90 service nova] Acquiring lock "5eef06c2-982d-484a-9606-e8c5a2971850-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:25:14 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-d9a183d6-3e2b-486d-ba17-4490c1bb1491 req-dc1d79b8-99fc-44b8-958e-35aba8877a90 service nova] Lock "5eef06c2-982d-484a-9606-e8c5a2971850-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:25:14 
user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-d9a183d6-3e2b-486d-ba17-4490c1bb1491 req-dc1d79b8-99fc-44b8-958e-35aba8877a90 service nova] Lock "5eef06c2-982d-484a-9606-e8c5a2971850-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:25:14 user nova-compute[71972]: DEBUG nova.compute.manager [req-d9a183d6-3e2b-486d-ba17-4490c1bb1491 req-dc1d79b8-99fc-44b8-958e-35aba8877a90 service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] No waiting events found dispatching network-vif-plugged-f9e56ee9-5e39-4871-a635-202dfa5ab8d1 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:25:14 user nova-compute[71972]: WARNING nova.compute.manager [req-d9a183d6-3e2b-486d-ba17-4490c1bb1491 req-dc1d79b8-99fc-44b8-958e-35aba8877a90 service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Received unexpected event network-vif-plugged-f9e56ee9-5e39-4871-a635-202dfa5ab8d1 for instance with vm_state building and task_state spawning. Apr 17 22:25:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:14 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:16 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Resumed> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:25:16 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] VM Resumed (Lifecycle Event) Apr 17 22:25:16 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Instance event wait completed in 0 seconds for {{(pid=71972) wait_for_instance_event /opt/stack/nova/nova/compute/manager.py:577}} Apr 17 22:25:16 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Guest created on hypervisor {{(pid=71972) spawn /opt/stack/nova/nova/virt/libvirt/driver.py:4392}} Apr 17 22:25:16 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Instance spawned successfully. 
Apr 17 22:25:16 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:889}} Apr 17 22:25:16 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:25:16 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:25:16 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Found default for hw_cdrom_bus of ide {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:25:16 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Found default for hw_disk_bus of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:25:16 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Found default for hw_input_bus of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:25:16 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Found default for hw_pointer_model of None {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:25:16 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Found default for hw_video_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:25:16 user nova-compute[71972]: DEBUG nova.virt.libvirt.driver [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 
tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Found default for hw_vif_model of virtio {{(pid=71972) _register_undefined_instance_details /opt/stack/nova/nova/virt/libvirt/driver.py:918}} Apr 17 22:25:16 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:25:16 user nova-compute[71972]: DEBUG nova.virt.driver [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] Emitting event Started> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:25:16 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] VM Started (Lifecycle Event) Apr 17 22:25:16 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:25:16 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 {{(pid=71972) handle_lifecycle_event /opt/stack/nova/nova/compute/manager.py:1396}} Apr 17 22:25:16 user nova-compute[71972]: INFO nova.compute.manager [None req-2bd53e85-42bf-46c6-bff9-bd1499a1a0ef None None] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] During sync_power_state the instance has a pending task (spawning). Skip. Apr 17 22:25:16 user nova-compute[71972]: INFO nova.compute.manager [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Took 5.38 seconds to spawn the instance on the hypervisor. Apr 17 22:25:16 user nova-compute[71972]: DEBUG nova.compute.manager [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:25:16 user nova-compute[71972]: INFO nova.compute.manager [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Took 5.90 seconds to build instance. 
Apr 17 22:25:16 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-2697ed71-d1f3-4d44-b389-abe1a9a5769c tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Lock "5eef06c2-982d-484a-9606-e8c5a2971850" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 6.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:25:16 user nova-compute[71972]: DEBUG nova.compute.manager [req-42bb5dd2-dc85-40f6-b2ba-cdda514480a5 req-e0d59a7c-7a6e-4b58-88ce-30dee5753601 service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Received event network-vif-plugged-f9e56ee9-5e39-4871-a635-202dfa5ab8d1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:25:16 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-42bb5dd2-dc85-40f6-b2ba-cdda514480a5 req-e0d59a7c-7a6e-4b58-88ce-30dee5753601 service nova] Acquiring lock "5eef06c2-982d-484a-9606-e8c5a2971850-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:25:16 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-42bb5dd2-dc85-40f6-b2ba-cdda514480a5 req-e0d59a7c-7a6e-4b58-88ce-30dee5753601 service nova] Lock "5eef06c2-982d-484a-9606-e8c5a2971850-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:25:16 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-42bb5dd2-dc85-40f6-b2ba-cdda514480a5 req-e0d59a7c-7a6e-4b58-88ce-30dee5753601 service nova] Lock "5eef06c2-982d-484a-9606-e8c5a2971850-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:25:16 user nova-compute[71972]: DEBUG nova.compute.manager [req-42bb5dd2-dc85-40f6-b2ba-cdda514480a5 req-e0d59a7c-7a6e-4b58-88ce-30dee5753601 service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] No waiting events found dispatching network-vif-plugged-f9e56ee9-5e39-4871-a635-202dfa5ab8d1 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:25:16 user nova-compute[71972]: WARNING nova.compute.manager [req-42bb5dd2-dc85-40f6-b2ba-cdda514480a5 req-e0d59a7c-7a6e-4b58-88ce-30dee5753601 service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Received unexpected event network-vif-plugged-f9e56ee9-5e39-4871-a635-202dfa5ab8d1 for instance with vm_state active and task_state None. 
Apr 17 22:25:17 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:22 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:27 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:28 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._sync_power_states {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:25:28 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Triggering sync for uuid 5eef06c2-982d-484a-9606-e8c5a2971850 {{(pid=71972) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10202}} Apr 17 22:25:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "5eef06c2-982d-484a-9606-e8c5a2971850" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:25:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "5eef06c2-982d-484a-9606-e8c5a2971850" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:25:28 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "5eef06c2-982d-484a-9606-e8c5a2971850" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.022s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:25:32 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:38 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:42 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:43 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:47 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:25:47 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:25:47 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Rebuilding the list of instances to heal {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 22:25:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "refresh_cache-5eef06c2-982d-484a-9606-e8c5a2971850" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:25:47 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquired lock "refresh_cache-5eef06c2-982d-484a-9606-e8c5a2971850" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:25:47 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Forcefully refreshing network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 22:25:47 user nova-compute[71972]: DEBUG nova.objects.instance [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lazy-loading 'info_cache' on Instance uuid 5eef06c2-982d-484a-9606-e8c5a2971850 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:25:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:25:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:25:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:25:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:25:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 
{{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:48 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Updating instance_info_cache with network_info: [{"id": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "address": "fa:16:3e:7f:4e:23", "network": {"id": "38a74fb3-65db-4af3-b3a0-81fb7ce3ba18", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-35556120-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c15f6fa11d3549ceb94c8abb06c78f7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9e56ee9-5e", "ovs_interfaceid": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:25:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Releasing lock "refresh_cache-5eef06c2-982d-484a-9606-e8c5a2971850" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:25:48 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Updated the network info_cache for instance {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 22:25:48 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:25:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:25:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:25:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:25:48 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for 
user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:25:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:25:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json" returned: 0 in 0.139s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:25:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:25:49 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json" returned: 0 in 0.130s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:25:49 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:25:49 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 22:25:49 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=9074MB free_disk=26.58843231201172GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:25:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:25:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:25:49 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 5eef06c2-982d-484a-9606-e8c5a2971850 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:25:49 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:25:49 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:25:49 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Refreshing inventories for resource provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} Apr 17 22:25:49 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Updating ProviderTree inventory for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e from _refresh_and_get_inventory using data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} Apr 17 22:25:49 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Updating inventory in ProviderTree for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e with inventory: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} Apr 17 22:25:49 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Refreshing aggregate associations for resource provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e, aggregates: None {{(pid=71972) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} Apr 17 22:25:49 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Refreshing trait associations for resource provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e, traits: 
HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_RESCUE_BFV,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VMVGA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NODE,HW_CPU_X86_SSE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_QXL,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE2,COMPUTE_VOLUME_MULTI_ATTACH {{(pid=71972) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} Apr 17 22:25:49 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:25:49 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:25:49 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:25:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.407s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:25:51 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:25:51 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:25:51 user nova-compute[71972]: 
DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:25:52 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:25:52 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:25:52 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:25:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:25:53 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:25:56 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:25:58 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:26:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:26:03 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:26:08 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:26:13 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:26:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:26:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:26:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:26:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: 
entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:26:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:26:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:26:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:26:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:26:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:26:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:26:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:26:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:26:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:26:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:26:37 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:26:38 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:26:43 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:26:43 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:26:43 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:26:43 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:26:43 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:26:43 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) 
__log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:26:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:26:48 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:26:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:26:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:26:48 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:26:48 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:26:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:26:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json" returned: 0 in 0.134s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:26:48 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:26:49 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 
-- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json" returned: 0 in 0.136s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:26:49 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:26:49 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:26:49 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=9102MB free_disk=26.587867736816406GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": 
"0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:26:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:26:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:26:49 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 5eef06c2-982d-484a-9606-e8c5a2971850 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:26:49 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:26:49 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:26:49 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:26:49 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:26:49 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:26:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.160s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:26:50 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:26:50 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:26:50 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Rebuilding the list of instances to heal {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 22:26:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "refresh_cache-5eef06c2-982d-484a-9606-e8c5a2971850" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:26:50 user nova-compute[71972]: DEBUG 
oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquired lock "refresh_cache-5eef06c2-982d-484a-9606-e8c5a2971850" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:26:50 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Forcefully refreshing network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 22:26:50 user nova-compute[71972]: DEBUG nova.objects.instance [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lazy-loading 'info_cache' on Instance uuid 5eef06c2-982d-484a-9606-e8c5a2971850 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:26:51 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Updating instance_info_cache with network_info: [{"id": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "address": "fa:16:3e:7f:4e:23", "network": {"id": "38a74fb3-65db-4af3-b3a0-81fb7ce3ba18", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-35556120-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c15f6fa11d3549ceb94c8abb06c78f7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9e56ee9-5e", "ovs_interfaceid": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:26:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Releasing lock "refresh_cache-5eef06c2-982d-484a-9606-e8c5a2971850" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:26:51 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Updated the network info_cache for instance {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 22:26:51 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:26:52 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:26:52 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task 
ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:26:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:26:53 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:26:53 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:26:53 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:26:54 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:26:56 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:26:56 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:26:58 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:27:01 user nova-compute[71972]: DEBUG nova.compute.manager [req-4934ae74-b0de-4a24-839d-4560e28c6fa4 req-3e541e3c-70e2-4297-988f-455b9aa8570a service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Received event network-changed-f9e56ee9-5e39-4871-a635-202dfa5ab8d1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:27:01 user nova-compute[71972]: DEBUG nova.compute.manager [req-4934ae74-b0de-4a24-839d-4560e28c6fa4 req-3e541e3c-70e2-4297-988f-455b9aa8570a service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Refreshing instance network info cache due to event network-changed-f9e56ee9-5e39-4871-a635-202dfa5ab8d1. 
{{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10987}} Apr 17 22:27:01 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-4934ae74-b0de-4a24-839d-4560e28c6fa4 req-3e541e3c-70e2-4297-988f-455b9aa8570a service nova] Acquiring lock "refresh_cache-5eef06c2-982d-484a-9606-e8c5a2971850" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:27:01 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-4934ae74-b0de-4a24-839d-4560e28c6fa4 req-3e541e3c-70e2-4297-988f-455b9aa8570a service nova] Acquired lock "refresh_cache-5eef06c2-982d-484a-9606-e8c5a2971850" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:27:01 user nova-compute[71972]: DEBUG nova.network.neutron [req-4934ae74-b0de-4a24-839d-4560e28c6fa4 req-3e541e3c-70e2-4297-988f-455b9aa8570a service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Refreshing network info cache for port f9e56ee9-5e39-4871-a635-202dfa5ab8d1 {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1997}} Apr 17 22:27:02 user nova-compute[71972]: DEBUG nova.network.neutron [req-4934ae74-b0de-4a24-839d-4560e28c6fa4 req-3e541e3c-70e2-4297-988f-455b9aa8570a service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Updated VIF entry in instance network info cache for port f9e56ee9-5e39-4871-a635-202dfa5ab8d1. {{(pid=71972) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3472}} Apr 17 22:27:02 user nova-compute[71972]: DEBUG nova.network.neutron [req-4934ae74-b0de-4a24-839d-4560e28c6fa4 req-3e541e3c-70e2-4297-988f-455b9aa8570a service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Updating instance_info_cache with network_info: [{"id": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "address": "fa:16:3e:7f:4e:23", "network": {"id": "38a74fb3-65db-4af3-b3a0-81fb7ce3ba18", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-35556120-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.126", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c15f6fa11d3549ceb94c8abb06c78f7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9e56ee9-5e", "ovs_interfaceid": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:27:02 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-4934ae74-b0de-4a24-839d-4560e28c6fa4 req-3e541e3c-70e2-4297-988f-455b9aa8570a service nova] Releasing lock "refresh_cache-5eef06c2-982d-484a-9606-e8c5a2971850" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:27:02 user nova-compute[71972]: DEBUG nova.compute.manager [None req-7de26976-b2a7-43a7-90d1-8d5f219e3ed5 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 
5eef06c2-982d-484a-9606-e8c5a2971850] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:27:02 user nova-compute[71972]: INFO nova.compute.manager [None req-7de26976-b2a7-43a7-90d1-8d5f219e3ed5 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] instance snapshotting Apr 17 22:27:02 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-7de26976-b2a7-43a7-90d1-8d5f219e3ed5 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Beginning live snapshot process Apr 17 22:27:02 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-7de26976-b2a7-43a7-90d1-8d5f219e3ed5 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json -f qcow2 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:27:02 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:27:03 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-7de26976-b2a7-43a7-90d1-8d5f219e3ed5 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json -f qcow2" returned: 0 in 0.137s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:27:03 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-7de26976-b2a7-43a7-90d1-8d5f219e3ed5 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json -f qcow2 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:27:03 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:27:03 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-7de26976-b2a7-43a7-90d1-8d5f219e3ed5 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json -f qcow2" returned: 0 in 0.136s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:27:03 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None 
req-7de26976-b2a7-43a7-90d1-8d5f219e3ed5 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:27:03 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-7de26976-b2a7-43a7-90d1-8d5f219e3ed5 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720 --force-share --output=json" returned: 0 in 0.132s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:27:03 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-7de26976-b2a7-43a7-90d1-8d5f219e3ed5 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmp3ovweh8m/39598d48cd32471b8581bbc36614d19e.delta 1073741824 {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:27:03 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-7de26976-b2a7-43a7-90d1-8d5f219e3ed5 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/opt/stack/data/nova/instances/_base/fde46b8e739fd6213d1525690f3ccf27384ee720,backing_fmt=raw /opt/stack/data/nova/instances/snapshots/tmp3ovweh8m/39598d48cd32471b8581bbc36614d19e.delta 1073741824" returned: 0 in 0.048s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:27:03 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-7de26976-b2a7-43a7-90d1-8d5f219e3ed5 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Quiescing instance not available: QEMU guest agent is not enabled. 
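For reference, the overlay-then-convert sequence the live-snapshot path logs above (qemu-img info on the instance disk and base image, qemu-img create of a qcow2 ".delta" overlay on the raw base, then qemu-img convert of the filled overlay) can be reproduced outside Nova. A minimal sketch using the same qemu-img flags as the log, but with hypothetical paths and the 1 GiB size shown above; this is illustrative only, not the driver's own code path:

    # Illustrative sketch: mirrors the qemu-img calls seen in the log above.
    # All paths are placeholders; the 1073741824-byte size matches the log.
    import json
    import subprocess

    BASE = "/tmp/example/_base/backing.raw"   # hypothetical raw backing image
    DELTA = "/tmp/example/snap.delta"         # qcow2 overlay filled by the COPY block job
    OUT = "/tmp/example/snap.qcow2"           # standalone image handed to the upload step
    SIZE = 1 * 1024 ** 3                      # 1073741824 bytes

    def img_info(path):
        # Same probe the driver runs: qemu-img info --force-share --output=json
        out = subprocess.run(
            ["qemu-img", "info", "--force-share", "--output=json", path],
            check=True, capture_output=True, text=True).stdout
        return json.loads(out)

    # 1. Create the qcow2 overlay on top of the raw base (the ".delta" file).
    subprocess.run(
        ["qemu-img", "create", "-f", "qcow2",
         "-o", f"backing_file={BASE},backing_fmt=raw", DELTA, str(SIZE)],
        check=True)

    # 2. Once the block job has populated the overlay, flatten it into a
    #    standalone qcow2 image, as "qemu-img convert -t none" does in the log.
    subprocess.run(
        ["qemu-img", "convert", "-t", "none", "-O", "qcow2", "-f", "qcow2",
         DELTA, OUT],
        check=True)

    print(img_info(OUT)["format"])   # expected: "qcow2"
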
Apr 17 22:27:04 user nova-compute[71972]: DEBUG nova.virt.libvirt.guest [None req-7de26976-b2a7-43a7-90d1-8d5f219e3ed5 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] COPY block job progress, current cursor: 131072 final cursor: 43778048 {{(pid=71972) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 17 22:27:04 user nova-compute[71972]: DEBUG nova.virt.libvirt.guest [None req-7de26976-b2a7-43a7-90d1-8d5f219e3ed5 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] COPY block job progress, current cursor: 43778048 final cursor: 43778048 {{(pid=71972) is_job_complete /opt/stack/nova/nova/virt/libvirt/guest.py:846}} Apr 17 22:27:04 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-7de26976-b2a7-43a7-90d1-8d5f219e3ed5 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Skipping quiescing instance: QEMU guest agent is not enabled. Apr 17 22:27:04 user nova-compute[71972]: DEBUG nova.privsep.utils [None req-7de26976-b2a7-43a7-90d1-8d5f219e3ed5 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Path '/opt/stack/data/nova/instances' supports direct I/O {{(pid=71972) supports_direct_io /opt/stack/nova/nova/privsep/utils.py:63}} Apr 17 22:27:04 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-7de26976-b2a7-43a7-90d1-8d5f219e3ed5 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmp3ovweh8m/39598d48cd32471b8581bbc36614d19e.delta /opt/stack/data/nova/instances/snapshots/tmp3ovweh8m/39598d48cd32471b8581bbc36614d19e {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:27:05 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-7de26976-b2a7-43a7-90d1-8d5f219e3ed5 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /opt/stack/data/nova/instances/snapshots/tmp3ovweh8m/39598d48cd32471b8581bbc36614d19e.delta /opt/stack/data/nova/instances/snapshots/tmp3ovweh8m/39598d48cd32471b8581bbc36614d19e" returned: 0 in 0.385s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:27:05 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-7de26976-b2a7-43a7-90d1-8d5f219e3ed5 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Snapshot extracted, beginning image upload Apr 17 22:27:07 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-7de26976-b2a7-43a7-90d1-8d5f219e3ed5 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Snapshot image upload complete Apr 17 22:27:07 user nova-compute[71972]: INFO nova.compute.manager [None req-7de26976-b2a7-43a7-90d1-8d5f219e3ed5 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Took 
4.50 seconds to snapshot the instance on the hypervisor. Apr 17 22:27:07 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:27:08 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:27:12 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:27:13 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:27:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:27:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:27:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:27:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:27:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:27:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:27:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:27:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:27:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:27:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:27:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:27:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:27:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:27:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup 
/usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:27:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:27:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:27:38 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:27:38 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:27:43 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:27:43 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:27:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:27:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:27:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:27:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:27:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:27:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:27:49 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:27:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:27:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:27:49 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:27:49 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:27:49 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:27:49 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:27:49 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:27:50 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json" returned: 0 in 0.132s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:27:50 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:27:50 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. 
Apr 17 22:27:50 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=9098MB free_disk=26.551044464111328GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", 
"numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:27:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:27:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:27:50 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 5eef06c2-982d-484a-9606-e8c5a2971850 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:27:50 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:27:50 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:27:50 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:27:50 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:27:50 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:27:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.183s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:27:51 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:27:51 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:27:51 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Rebuilding the list of instances to heal {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 22:27:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "refresh_cache-5eef06c2-982d-484a-9606-e8c5a2971850" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:27:51 user nova-compute[71972]: DEBUG 
oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquired lock "refresh_cache-5eef06c2-982d-484a-9606-e8c5a2971850" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:27:51 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Forcefully refreshing network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 22:27:51 user nova-compute[71972]: DEBUG nova.objects.instance [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lazy-loading 'info_cache' on Instance uuid 5eef06c2-982d-484a-9606-e8c5a2971850 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:27:52 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Updating instance_info_cache with network_info: [{"id": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "address": "fa:16:3e:7f:4e:23", "network": {"id": "38a74fb3-65db-4af3-b3a0-81fb7ce3ba18", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-35556120-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.126", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c15f6fa11d3549ceb94c8abb06c78f7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9e56ee9-5e", "ovs_interfaceid": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:27:52 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Releasing lock "refresh_cache-5eef06c2-982d-484a-9606-e8c5a2971850" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:27:52 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Updated the network info_cache for instance {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 22:27:52 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:27:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:27:53 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) 
run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:27:53 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:27:54 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:27:54 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:27:55 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:27:56 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:27:56 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:27:58 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:27:58 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:27:58 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:27:58 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:27:58 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:27:58 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:28:03 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:28:03 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:28:03 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, 
sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:28:03 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:28:03 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:28:03 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:28:08 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:28:13 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:28:13 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:28:13 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:28:13 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:28:13 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:28:13 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:28:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:28:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:28:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:28:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:28:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:28:18 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:28:23 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:28:28 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms 
timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:28:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:28:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:28:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:28:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:28:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:28:33 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:28:38 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:28:43 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:28:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:28:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:28:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:28:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:28:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:28:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:28:50 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:28:50 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:28:50 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Rebuilding the list of 
instances to heal {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 22:28:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "refresh_cache-5eef06c2-982d-484a-9606-e8c5a2971850" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} Apr 17 22:28:50 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquired lock "refresh_cache-5eef06c2-982d-484a-9606-e8c5a2971850" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} Apr 17 22:28:50 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Forcefully refreshing network info cache for instance {{(pid=71972) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:1994}} Apr 17 22:28:50 user nova-compute[71972]: DEBUG nova.objects.instance [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lazy-loading 'info_cache' on Instance uuid 5eef06c2-982d-484a-9606-e8c5a2971850 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:28:51 user nova-compute[71972]: DEBUG nova.network.neutron [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Updating instance_info_cache with network_info: [{"id": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "address": "fa:16:3e:7f:4e:23", "network": {"id": "38a74fb3-65db-4af3-b3a0-81fb7ce3ba18", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-35556120-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.126", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c15f6fa11d3549ceb94c8abb06c78f7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9e56ee9-5e", "ovs_interfaceid": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:28:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Releasing lock "refresh_cache-5eef06c2-982d-484a-9606-e8c5a2971850" {{(pid=71972) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} Apr 17 22:28:51 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Updated the network info_cache for instance {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9863}} Apr 17 22:28:51 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:28:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:28:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:28:51 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:28:51 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:28:51 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:28:51 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json" returned: 0 in 0.143s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:28:51 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running cmd (subprocess): /usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}} Apr 17 22:28:52 user nova-compute[71972]: DEBUG oslo_concurrency.processutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CMD "/usr/bin/python3.10 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850/disk --force-share --output=json" returned: 0 in 0.133s {{(pid=71972) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}} Apr 17 22:28:52 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 17 22:28:52 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:28:52 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=9120MB free_disk=26.55044174194336GB free_vcpus=11 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", 
"product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:28:52 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:28:52 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:28:52 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Instance 5eef06c2-982d-484a-9606-e8c5a2971850 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=71972) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1632}} Apr 17 22:28:52 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 1 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:28:52 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=640MB phys_disk=40GB used_disk=1GB total_vcpus=12 used_vcpus=1 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:28:52 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:28:52 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:28:52 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} Apr 17 22:28:52 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.169s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:28:53 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:28:54 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:28:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Acquiring lock "5eef06c2-982d-484a-9606-e8c5a2971850" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:28:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 
tempest-ServerStableDeviceRescueTest-1898719742-project-member] Lock "5eef06c2-982d-484a-9606-e8c5a2971850" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:28:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Acquiring lock "5eef06c2-982d-484a-9606-e8c5a2971850-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:28:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Lock "5eef06c2-982d-484a-9606-e8c5a2971850-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:28:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Lock "5eef06c2-982d-484a-9606-e8c5a2971850-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:28:55 user nova-compute[71972]: INFO nova.compute.manager [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Terminating instance Apr 17 22:28:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Start destroying the instance on the hypervisor. 
{{(pid=71972) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3105}} Apr 17 22:28:55 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:28:55 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:28:55 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:28:55 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:28:55 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:28:55 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:28:55 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}} Apr 17 22:28:55 user nova-compute[71972]: DEBUG nova.compute.manager [req-2a76a758-a61f-4859-a870-40daf420c43a req-6aaace8e-99bb-4b78-aab5-cb046cb17574 service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Received event network-vif-unplugged-f9e56ee9-5e39-4871-a635-202dfa5ab8d1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:28:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-2a76a758-a61f-4859-a870-40daf420c43a req-6aaace8e-99bb-4b78-aab5-cb046cb17574 service nova] Acquiring lock "5eef06c2-982d-484a-9606-e8c5a2971850-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:28:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-2a76a758-a61f-4859-a870-40daf420c43a req-6aaace8e-99bb-4b78-aab5-cb046cb17574 service nova] Lock "5eef06c2-982d-484a-9606-e8c5a2971850-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:28:55 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-2a76a758-a61f-4859-a870-40daf420c43a req-6aaace8e-99bb-4b78-aab5-cb046cb17574 service nova] Lock "5eef06c2-982d-484a-9606-e8c5a2971850-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:28:55 user nova-compute[71972]: DEBUG nova.compute.manager [req-2a76a758-a61f-4859-a870-40daf420c43a 
req-6aaace8e-99bb-4b78-aab5-cb046cb17574 service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] No waiting events found dispatching network-vif-unplugged-f9e56ee9-5e39-4871-a635-202dfa5ab8d1 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:28:55 user nova-compute[71972]: DEBUG nova.compute.manager [req-2a76a758-a61f-4859-a870-40daf420c43a req-6aaace8e-99bb-4b78-aab5-cb046cb17574 service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Received event network-vif-unplugged-f9e56ee9-5e39-4871-a635-202dfa5ab8d1 for instance with task_state deleting. {{(pid=71972) _process_instance_event /opt/stack/nova/nova/compute/manager.py:10760}} Apr 17 22:28:55 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:28:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:28:56 user nova-compute[71972]: INFO nova.virt.libvirt.driver [-] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Instance destroyed successfully. Apr 17 22:28:56 user nova-compute[71972]: DEBUG nova.objects.instance [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Lazy-loading 'resources' on Instance uuid 5eef06c2-982d-484a-9606-e8c5a2971850 {{(pid=71972) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1100}} Apr 17 22:28:56 user nova-compute[71972]: DEBUG nova.virt.libvirt.vif [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2023-04-17T22:25:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-732289986',display_name='tempest-ServerStableDeviceRescueTest-server-732289986',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(11),hidden=False,host='user',hostname='tempest-serverstabledevicerescuetest-server-732289986',id=25,image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',info_cache=InstanceInfoCache,instance_type_id=11,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ77DUwS1YdOE+AgOW8F2OmFPS7mgWiVKzl7CpNcpYyYaE0PBWLchyK90gX9HM6IuyWgqbEzV4S7I/iswh9WIKISaxrVni8gvtqHbgvv2mKDgVJIiC0uNr4C//sH7z9EdA==',key_name='tempest-keypair-1108392394',keypairs=,launch_index=0,launched_at=2023-04-17T22:25:16Z,launched_on='user',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='user',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='c15f6fa11d3549ceb94c8abb06c78f7e',ramdisk_id='',reservation_id='r-dhdwdd7p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='80deb4f3-99ac-4a4a-a04f-aa012f8a6b8f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='ide',image_hw_disk_bus='virtio',image_hw_input_bus=None,image_hw_machine_type='pc',image_hw_pointer_model=None,image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros-0.5.2-x86_64-disk',image_owner_specified.openstack.sha256='',owner_project_name='tempest-ServerStableDeviceRescueTest-1898719742',owner_user_name='tempest-ServerStableDeviceRescueTest-1898719742-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2023-04-17T22:27:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='0a3f4e1adcbf46308beb0bd8e35a6dcd',uuid=5eef06c2-982d-484a-9606-e8c5a2971850,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "address": "fa:16:3e:7f:4e:23", "network": {"id": "38a74fb3-65db-4af3-b3a0-81fb7ce3ba18", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-35556120-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.126", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c15f6fa11d3549ceb94c8abb06c78f7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9e56ee9-5e", "ovs_interfaceid": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) unplug /opt/stack/nova/nova/virt/libvirt/vif.py:828}} Apr 17 22:28:56 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Converting VIF {"id": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "address": "fa:16:3e:7f:4e:23", "network": {"id": "38a74fb3-65db-4af3-b3a0-81fb7ce3ba18", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-35556120-network", "subnets": [{"cidr": "10.0.0.0/28", "dns": [], "gateway": {"address": "10.0.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.0.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "172.24.4.126", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "10.0.0.1"}}], "meta": {"injected": false, "tenant_id": "c15f6fa11d3549ceb94c8abb06c78f7e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bound_drivers": {"0": "ovn"}}, "devname": "tapf9e56ee9-5e", "ovs_interfaceid": "f9e56ee9-5e39-4871-a635-202dfa5ab8d1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:511}} Apr 17 22:28:56 user nova-compute[71972]: DEBUG nova.network.os_vif_util [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7f:4e:23,bridge_name='br-int',has_traffic_filtering=True,id=f9e56ee9-5e39-4871-a635-202dfa5ab8d1,network=Network(38a74fb3-65db-4af3-b3a0-81fb7ce3ba18),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9e56ee9-5e') {{(pid=71972) nova_to_osvif_vif /opt/stack/nova/nova/network/os_vif_util.py:548}} Apr 17 22:28:56 user nova-compute[71972]: DEBUG os_vif [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:4e:23,bridge_name='br-int',has_traffic_filtering=True,id=f9e56ee9-5e39-4871-a635-202dfa5ab8d1,network=Network(38a74fb3-65db-4af3-b3a0-81fb7ce3ba18),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9e56ee9-5e') {{(pid=71972) unplug /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:109}} Apr 17 22:28:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:28:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9e56ee9-5e, bridge=br-int, if_exists=True) {{(pid=71972) do_commit /usr/local/lib/python3.10/dist-packages/ovsdbapp/backend/ovs_idl/transaction.py:89}} Apr 17 22:28:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:28:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:28:56 user nova-compute[71972]: INFO os_vif [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:4e:23,bridge_name='br-int',has_traffic_filtering=True,id=f9e56ee9-5e39-4871-a635-202dfa5ab8d1,network=Network(38a74fb3-65db-4af3-b3a0-81fb7ce3ba18),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf9e56ee9-5e') Apr 17 22:28:56 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None 
req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Deleting instance files /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850_del Apr 17 22:28:56 user nova-compute[71972]: INFO nova.virt.libvirt.driver [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Deletion of /opt/stack/data/nova/instances/5eef06c2-982d-484a-9606-e8c5a2971850_del complete Apr 17 22:28:56 user nova-compute[71972]: INFO nova.compute.manager [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Took 0.85 seconds to destroy the instance on the hypervisor. Apr 17 22:28:56 user nova-compute[71972]: DEBUG oslo.service.loopingcall [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=71972) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} Apr 17 22:28:56 user nova-compute[71972]: DEBUG nova.compute.manager [-] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Deallocating network for instance {{(pid=71972) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2264}} Apr 17 22:28:56 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] deallocate_for_instance() {{(pid=71972) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1793}} Apr 17 22:28:56 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:28:56 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:28:56 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:28:57 user nova-compute[71972]: DEBUG nova.network.neutron [-] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Updating instance_info_cache with network_info: [] {{(pid=71972) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} Apr 17 22:28:57 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Took 1.38 seconds to deallocate network for instance. 
Apr 17 22:28:57 user nova-compute[71972]: DEBUG nova.compute.manager [req-2eeed779-d792-4925-a51f-a87b29bf90a0 req-3b3c6f6f-3c77-4a86-b51c-8a3af5694a0d service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Received event network-vif-deleted-f9e56ee9-5e39-4871-a635-202dfa5ab8d1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:28:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:28:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:28:57 user nova-compute[71972]: DEBUG nova.compute.manager [req-75e13b29-21cd-4f23-b4d4-971404eae91d req-dfc4e020-c1fa-4eda-9da3-a51afbb5fab7 service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Received event network-vif-plugged-f9e56ee9-5e39-4871-a635-202dfa5ab8d1 {{(pid=71972) external_instance_event /opt/stack/nova/nova/compute/manager.py:10982}} Apr 17 22:28:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-75e13b29-21cd-4f23-b4d4-971404eae91d req-dfc4e020-c1fa-4eda-9da3-a51afbb5fab7 service nova] Acquiring lock "5eef06c2-982d-484a-9606-e8c5a2971850-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:28:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-75e13b29-21cd-4f23-b4d4-971404eae91d req-dfc4e020-c1fa-4eda-9da3-a51afbb5fab7 service nova] Lock "5eef06c2-982d-484a-9606-e8c5a2971850-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:28:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [req-75e13b29-21cd-4f23-b4d4-971404eae91d req-dfc4e020-c1fa-4eda-9da3-a51afbb5fab7 service nova] Lock "5eef06c2-982d-484a-9606-e8c5a2971850-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:28:57 user nova-compute[71972]: DEBUG nova.compute.manager [req-75e13b29-21cd-4f23-b4d4-971404eae91d req-dfc4e020-c1fa-4eda-9da3-a51afbb5fab7 service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] No waiting events found dispatching network-vif-plugged-f9e56ee9-5e39-4871-a635-202dfa5ab8d1 {{(pid=71972) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} Apr 17 22:28:57 user nova-compute[71972]: WARNING nova.compute.manager [req-75e13b29-21cd-4f23-b4d4-971404eae91d req-dfc4e020-c1fa-4eda-9da3-a51afbb5fab7 service nova] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Received unexpected event network-vif-plugged-f9e56ee9-5e39-4871-a635-202dfa5ab8d1 for instance 
with vm_state deleted and task_state None. Apr 17 22:28:57 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:28:57 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:28:57 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.119s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:28:57 user nova-compute[71972]: INFO nova.scheduler.client.report [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Deleted allocations for instance 5eef06c2-982d-484a-9606-e8c5a2971850 Apr 17 22:28:58 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f2cc0676-7752-4baf-a4e0-378b8cdd0ef4 tempest-ServerStableDeviceRescueTest-1898719742 tempest-ServerStableDeviceRescueTest-1898719742-project-member] Lock "5eef06c2-982d-484a-9606-e8c5a2971850" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 2.592s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:28:58 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:29:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:29:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:29:11 user nova-compute[71972]: DEBUG nova.virt.driver [-] Emitting event Stopped> {{(pid=71972) emit_event /opt/stack/nova/nova/virt/driver.py:1653}} Apr 17 22:29:11 user nova-compute[71972]: INFO nova.compute.manager [-] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] VM Stopped (Lifecycle Event) Apr 17 22:29:11 user nova-compute[71972]: DEBUG nova.compute.manager [None 
req-309a9e1a-6363-4450-880d-393c99217b9e None None] [instance: 5eef06c2-982d-484a-9606-e8c5a2971850] Checking state {{(pid=71972) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} Apr 17 22:29:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:29:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:29:21 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:29:26 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:29:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:29:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:29:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}} Apr 17 22:29:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:29:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}} Apr 17 22:29:31 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:29:36 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:29:41 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:29:46 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}} Apr 17 22:29:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:29:48 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:29:50 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:29:50 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Starting heal instance info cache {{(pid=71972) 
_heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9792}} Apr 17 22:29:50 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Rebuilding the list of instances to heal {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9796}} Apr 17 22:29:50 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Didn't find any instances for network info cache update. {{(pid=71972) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9878}} Apr 17 22:29:50 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:29:50 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Cleaning up deleted instances with incomplete migration {{(pid=71972) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11117}} Apr 17 22:29:51 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}} Apr 17 22:29:52 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:29:53 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager.update_available_resource {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} Apr 17 22:29:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:29:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:29:53 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} Apr 17 22:29:53 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Auditing locally available compute resources for user (node: user) {{(pid=71972) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} Apr 17 22:29:54 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported. Apr 17 22:29:54 user nova-compute[71972]: WARNING nova.virt.libvirt.driver [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported. Apr 17 22:29:54 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Hypervisor/Node resource view: name=user free_ram=9202MB free_disk=26.596900939941406GB free_vcpus=12 pci_devices=[{"dev_id": "pci_0000_00_15_7", "address": "0000:00:15.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_4", "address": "0000:00:15.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_6", "address": "0000:00:17.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_4", "address": "0000:00:18.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "7190", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7190", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_1", "address": "0000:00:16.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_1", "address": "0000:00:17.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_5", "address": "0000:00:15.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_02_01_0", "address": "0000:02:01.0", "product_id": "07e0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07e0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_3", "address": "0000:00:16.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_7", "address": "0000:00:17.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_5", "address": "0000:00:18.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_3", "address": "0000:00:17.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_0", "address": "0000:00:16.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7191", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7191", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_7", "address": "0000:00:18.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_6", "address": "0000:00:16.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", 
"product_id": "7110", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7110", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_2", "address": "0000:00:17.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_0", "address": "0000:00:15.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_1", "address": "0000:00:07.1", "product_id": "7111", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7111", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_03_00_0", "address": "0000:03:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_2", "address": "0000:00:15.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_1", "address": "0000:00:15.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_5", "address": "0000:00:16.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_11_0", "address": "0000:00:11.0", "product_id": "0790", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0790", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_10_0", "address": "0000:00:10.0", "product_id": "0030", "vendor_id": "1000", "numa_node": null, "label": "label_1000_0030", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_2", "address": "0000:00:18.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_0b_00_0", "address": "0000:0b:00.0", "product_id": "07b0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07b0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_3", "address": "0000:00:07.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_3", "address": "0000:00:15.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_7", "address": "0000:00:07.7", "product_id": "0740", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0740", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_6", "address": "0000:00:18.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_2", "address": "0000:00:16.2", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_0", "address": "0000:00:17.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_15_6", "address": "0000:00:15.6", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_0f_0", "address": "0000:00:0f.0", "product_id": "0405", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_0405", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_0", "address": "0000:00:18.0", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_4", "address": "0000:00:16.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_4", "address": "0000:00:17.4", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_17_5", "address": "0000:00:17.5", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_16_7", "address": "0000:00:16.7", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_1", "address": "0000:00:18.1", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_18_3", "address": "0000:00:18.3", "product_id": "07a0", "vendor_id": "15ad", "numa_node": null, "label": "label_15ad_07a0", "dev_type": "type-PCI"}] {{(pid=71972) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} Apr 17 22:29:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} Apr 17 22:29:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} Apr 17 22:29:54 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Total usable vcpus: 12, total allocated vcpus: 0 {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} Apr 17 22:29:54 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Final resource view: name=user phys_ram=16023MB used_ram=512MB phys_disk=40GB used_disk=0GB total_vcpus=12 used_vcpus=0 pci_stats=[] {{(pid=71972) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} Apr 17 22:29:54 user nova-compute[71972]: DEBUG nova.compute.provider_tree [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed in ProviderTree for provider: 40de8fba-171e-4c3d-8cc3-30d210d6a26e {{(pid=71972) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} Apr 17 22:29:54 user nova-compute[71972]: DEBUG nova.scheduler.client.report [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Inventory has not changed for provider 40de8fba-171e-4c3d-8cc3-30d210d6a26e based on inventory data: {'VCPU': {'total': 12, 'reserved': 0, 'min_unit': 1, 'max_unit': 12, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 16023, 'reserved': 512, 'min_unit': 1, 'max_unit': 16023, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 40, 'reserved': 0, 'min_unit': 1, 'max_unit': 40, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=71972) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} Apr 17 22:29:54 user nova-compute[71972]: DEBUG 
Apr 17 22:29:54 user nova-compute[71972]: DEBUG nova.compute.resource_tracker [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Compute_service record updated for user:user {{(pid=71972) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
Apr 17 22:29:54 user nova-compute[71972]: DEBUG oslo_concurrency.lockutils [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.138s {{(pid=71972) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
Apr 17 22:29:56 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 22:29:56 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 22:29:56 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 22:29:56 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}}
Apr 17 22:29:56 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 22:29:56 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=71972) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10411}}
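The paired "Acquiring lock" / "acquired ... waited" / ""released" ... held" messages around _update_available_resource are emitted by oslo.concurrency's lock wrapper, which serializes the resource tracker's critical sections on an in-process lock named "compute_resources". A minimal sketch of that pattern (the function name below is made up for illustration, not Nova's code):

    # Minimal sketch: serializing a critical section the way the log lines above show.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def _update_available_resource_demo():
        # Only one thread at a time runs this body while holding the
        # in-process "compute_resources" lock; oslo.concurrency logs the
        # acquire/release (and waited/held times) at DEBUG level.
        pass

    _update_available_resource_demo()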
Apr 17 22:29:58 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 22:29:58 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 22:29:59 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 22:30:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}}
Apr 17 22:30:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 17 22:30:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}}
Apr 17 22:30:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}}
Apr 17 22:30:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}}
Apr 17 22:30:01 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 17 22:30:06 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}}
Apr 17 22:30:08 user nova-compute[71972]: DEBUG oslo_service.periodic_task [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=71972) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
Apr 17 22:30:08 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] Cleaning up deleted instances {{(pid=71972) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11079}}
Apr 17 22:30:08 user nova-compute[71972]: DEBUG nova.compute.manager [None req-f866c7f7-0da0-4af1-9805-1c95a094fd66 None None] There are 0 instances to clean {{(pid=71972) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11088}}
Apr 17 22:30:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}}
Apr 17 22:30:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
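All of the "Running periodic task ComputeManager._*" lines are emitted by oslo.service's periodic task runner; _reclaim_queued_deletes above bails out immediately because reclaim_instance_interval is not set to a positive value in this deployment. A rough, self-contained sketch of how such tasks are declared and driven (the class name and spacing below are illustrative, not Nova's actual manager):

    # Rough sketch: declaring and driving a periodic task with oslo.service.
    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF

    class DemoManager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=10)
        def _check_something(self, context):
            # Invoked roughly every 10 seconds by run_periodic_tasks().
            print("periodic task ran")

    mgr = DemoManager()
    # A real service calls this repeatedly from its main loop; each call logs
    # "Running periodic task ..." for every task that is due.
    mgr.run_periodic_tasks(context=None)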
Apr 17 22:30:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe {{(pid=71972) run /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:117}}
Apr 17 22:30:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}}
Apr 17 22:30:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE {{(pid=71972) _transition /usr/local/lib/python3.10/dist-packages/ovs/reconnect.py:519}}
Apr 17 22:30:11 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 17 22:30:16 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 31 {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:263}}
Apr 17 22:30:21 user nova-compute[71972]: DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout {{(pid=71972) __log_wakeup /usr/local/lib/python3.10/dist-packages/ovs/poller.py:248}}
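The recurring ovsdbapp/ovs messages show the OVSDB client keeping its tcp:127.0.0.1:6640 connection alive: after roughly 5000 ms of silence it sends an inactivity probe and the state machine drops to IDLE, returning to ACTIVE once the server responds. A toy state machine mirroring that behaviour (illustrative pseudologic only, not the ovs.reconnect API):

    # Toy model of the IDLE -> probe -> ACTIVE keep-alive pattern seen above.
    import time

    PROBE_INTERVAL_MS = 5000  # matches the ~5000 ms idle threshold in the log

    class ProbeStateMachine:
        def __init__(self):
            self.state = "ACTIVE"
            self.last_activity_ms = time.monotonic() * 1000

        def tick(self):
            # Called periodically; sends a probe once the link has been idle too long.
            idle = time.monotonic() * 1000 - self.last_activity_ms
            if self.state == "ACTIVE" and idle >= PROBE_INTERVAL_MS:
                print(f"idle {idle:.0f} ms, sending inactivity probe; entering IDLE")
                self.state = "IDLE"

        def on_reply(self):
            # Any traffic from the server counts as activity and re-arms the timer.
            self.last_activity_ms = time.monotonic() * 1000
            if self.state == "IDLE":
                print("entering ACTIVE")
                self.state = "ACTIVE"

    sm = ProbeStateMachine()
    sm.tick()      # no-op until the idle threshold is exceeded
    sm.on_reply()  # server traffic keeps the connection ACTIVE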